I am trying to configure Sphinx 2.1.6. When I enter a search word I get results, but when I leave the query empty or search only with filters, I get this error:
fullscan requires extern docinfo.
PHP:
require_once('/usr/share/sphinx/api/sphinxapi.php'); // Include the Sphinx PHP API
$cl = new SphinxClient();                // Start the SphinxClient class
$cl->SetMatchMode(SPH_MATCH_EXTENDED2);  // Match all words or any word?
$cl->SetSortMode(SPH_SORT_RELEVANCE);
$cl->setLimits(0, 10);                   // Works like MySQL LIMIT

$searchWord = "*";
if (!empty($_POST['searchData'])) {
    $searchWord = trim($_POST['searchData']);
}

// slider salary
//$firePHP->log($boo);
if (!empty($_POST['min_salary'])) {
    $min_salary = (int)$_POST['min_salary'];
    $max_salary = (int)$_POST['max_salary'];
    $exclude    = false;
    $cl->SetFilterRange('salary', $min_salary, $max_salary, $exclude);
}

// filter select city []
if (!empty($_POST['cities'])) {
    $city    = $_POST['cities'];
    $exclude = false;
    $cl->SetFilter('city', $city, $exclude);
}
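The actual query call is not shown above; roughly, it is issued like this (the index name vacancy is a placeholder for whatever the real index is called):

$result = $cl->Query($searchWord, 'vacancy'); // run the search against the index
if ($result === false) {
    echo 'Search error: ' . $cl->GetLastError();
} elseif (!empty($result['matches'])) {
    foreach ($result['matches'] as $docId => $match) {
        // each match carries the attributes declared with sql_attr_* (salary, city)
        echo $docId . ' => salary ' . $match['attrs']['salary'] . "\n";
    }
}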
Sphinx config:
sql_query = \
    SELECT id, occupation, experience, education, branch, \
           typeVacancy, salary, description, city, employer \
    FROM vacancy_view
sql_attr_uint = salary
sql_attr_uint = city
You must currently have docinfo = inline configured on your index:
http://sphinxsearch.com/docs/current.html#conf-docinfo
You need docinfo = extern (which happens to be the default, so you must have changed it!) for full scans to work.
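A minimal sketch of where that setting lives in sphinx.conf (the index name, source and path are placeholders):

index vacancy
{
    source  = vacancy_source
    path    = /var/lib/sphinx/data/vacancy
    docinfo = extern
}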
I am able to create an azurerm_postgresql_flexible_server and an azurerm_postgresql_flexible_server_database using Terraform.
I am not able to create a schema using TF, and I could not find much help in the documentation.
I also checked https://registry.terraform.io/providers/cyrilgdn/postgresql/latest/docs/resources/postgresql_schema
but that uses a different provider. I am not sure what I am missing here.
This is the TF template which creates the Azure PostgreSQL server and DB:
module "common_modules" {
source = "../modules/Main"
}
provider "azurerm" {
features {}
}
locals {
#Construct Tag Data for Resource
resourceTags = {
environment = var.environment
createdBy = var.createdBy
managedBy = var.managedBy
colorBand = var.colorBand
purpose = var.purpose
lastUpdateOn = formatdate("DD-MM-YYYY hh:mm:ss ZZZ", timestamp())
}
}
resource "azurerm_postgresql_flexible_server" "postgreSQL" {
name = var.postgreSQL
location = var.location
resource_group_name = var.ckeditorResorceGroup
administrator_login = var.postgreSQLAdmin
administrator_password = var.password
sku_name = "B_Standard_B1ms"
version = "13"
storage_mb = 32768
backup_retention_days = 7
geo_redundant_backup_enabled = false
tags = local.resourceTags
}
resource "azurerm_postgresql_flexible_server_database" "postgreSQLDB" {
name = var.postgreSQLDB
server_id = azurerm_postgresql_flexible_server.postgreSQL.id
collation = "en_US.utf8"
charset = "utf8"
}
resource "azurerm_postgresql_flexible_server_firewall_rule" "postgreSQLFirewallRule" {
name = "allow_access_to_azure_services"
server_id = azurerm_postgresql_flexible_server.postgreSQL.id
start_ip_address = "0.0.0.0"
end_ip_address = "0.0.0.0"
}
Have a look at https://registry.terraform.io/providers/cyrilgdn/postgresql or https://github.com/cyrilgdn/terraform-provider-postgresql
It is usable, but you need network connectivity to resolve names (Azure Private DNS zone) and to connect to the PostgreSQL Flexible Server. The Terraform code should run in the same VNet as the Flexible Server.
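A rough sketch of how that provider could be wired up against the resources above (the provider version and the schema name app are placeholders, and this assumes the machine running Terraform can actually reach the server):

terraform {
  required_providers {
    postgresql = {
      source  = "cyrilgdn/postgresql"
      version = "~> 1.21"   # placeholder version
    }
  }
}

provider "postgresql" {
  host      = azurerm_postgresql_flexible_server.postgreSQL.fqdn
  port      = 5432
  database  = azurerm_postgresql_flexible_server_database.postgreSQLDB.name
  username  = var.postgreSQLAdmin
  password  = var.password
  sslmode   = "require"
  superuser = false # Azure Flexible Server does not expose a real superuser
}

resource "postgresql_schema" "app" {
  name     = "app" # placeholder schema name
  database = azurerm_postgresql_flexible_server_database.postgreSQLDB.name
}

As noted above, this only works if name resolution and connectivity to the Flexible Server are in place wherever Terraform runs.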
I provisioned a VM with the following C# Pulumi snippet:
var ssrsVm = new WindowsVirtualMachine("vmssrs001", new WindowsVirtualMachineArgs
{
    Name = "vmssrs001",
    ResourceGroupName = resourceGroup.Name,
    NetworkInterfaceIds = { nic.Id },
    Size = "Standard_B1ms",
    AdminUsername = ssrsLogin,
    AdminPassword = ssrsPassword,
    SourceImageReference = new WindowsVirtualMachineSourceImageReferenceArgs
    {
        Publisher = "microsoftpowerbi",
        Offer = "ssrs-2016",
        Sku = "dev-rs-only",
        Version = "latest"
    },
    OsDisk = new WindowsVirtualMachineOsDiskArgs
    {
        Name = "vmssrs001disk",
        Caching = "ReadWrite",
        DiskSizeGb = 200,
        StorageAccountType = "Standard_LRS",
    }
});
After the VM has been provisioned, I would like to run a custom PowerShell script on it to add a firewall rule. I am now wondering how to do this as part of the Pulumi app.
With Azure it looks like I could do this with RunPowerShellScript, but I couldn't find anything about it in the Pulumi docs. Maybe there is a better way to handle my case?
UPDATE
Thanks to Ash's comment I was able to find VirtualMachineRunCommandByVirtualMachine, which seems like it should do what I'm looking for, but unfortunately the following code snippet returns an error.
var virtualMachineRunCommandByVirtualMachine = new VirtualMachineRunCommandByVirtualMachine("vmssrs001-script",
    new VirtualMachineRunCommandByVirtualMachineArgs
    {
        ResourceGroupName = resourceGroup.Name,
        VmName = ssrsVm.Name,
        RunAsUser = ssrsLogin,
        RunAsPassword = ssrsPassword,
        RunCommandName = "enable firewall rule for ssrs",
        Source = new VirtualMachineRunCommandScriptSourceArgs
        {
            Script =
                @"Firewall AllowHttpForSSRS
                {
                    Name = 'AllowHTTPForSSRS'
                    DisplayName = 'AllowHTTPForSSRS'
                    Group = 'PT Rule Group'
                    Ensure = 'Present'
                    Enabled = 'True'
                    Profile = 'Public'
                    Direction = 'Inbound'
                    LocalPort = ('80')
                    Protocol = 'TCP'
                    Description = 'Firewall Rule for SSRS HTTP'
                }"
        }
    });
The error:
The property 'runCommands' is not valid because the 'Microsoft.Compute/RunCommandPreview' feature is not enabled for this subscription.
It looks like other people are struggling with the same issue here.
You can use a Compute Extension to execute a script against a VM with Pulumi.
This article details some of the options if you were to complete the procedure via PowerShell instead.
As an addition to Ash's answer, here is how I integrated it with Pulumi.
First, I create a blob container for my project scripts:
var deploymentContainer = new BlobContainer("deploymentscripts", new BlobContainerArgs
{
    ContainerName = "deploymentscripts",
    ResourceGroupName = resourceGroup.Name,
    AccountName = storageAccount.Name,
});
Next, I upload all of my PowerShell scripts to that container with this snippet:
// deploymentFiles is a Dictionary<string, Output<string>> declared elsewhere in the stack,
// mapping each script file name to the signed URL of its uploaded blob
foreach (var file in Directory.EnumerateFiles(Path.Combine(Environment.CurrentDirectory, "Scripts")))
{
    var fileName = Path.GetFileName(file);
    var blob = new Blob(fileName, new BlobArgs
    {
        ResourceGroupName = resourceGroup.Name,
        AccountName = storageAccount.Name,
        ContainerName = deploymentContainer.Name,
        Source = new FileAsset(file),
    });
    deploymentFiles[fileName] = SignedBlobReadUrl(blob, deploymentContainer, storageAccount, resourceGroup);
}
SignedBlobReadUrl is grabbed from the Pulumi examples repo:
private static Output<string> SignedBlobReadUrl(Blob blob, BlobContainer container, StorageAccount account, ResourceGroup resourceGroup)
{
    return Output.Tuple<string, string, string, string>(
        blob.Name, container.Name, account.Name, resourceGroup.Name).Apply(t =>
    {
        (string blobName, string containerName, string accountName, string resourceGroupName) = t;

        var blobSAS = ListStorageAccountServiceSAS.InvokeAsync(new ListStorageAccountServiceSASArgs
        {
            AccountName = accountName,
            Protocols = HttpProtocol.Https,
            SharedAccessStartTime = "2021-01-01",
            SharedAccessExpiryTime = "2030-01-01",
            Resource = SignedResource.C,
            ResourceGroupName = resourceGroupName,
            Permissions = Permissions.R,
            CanonicalizedResource = "/blob/" + accountName + "/" + containerName,
            CacheControl = "max-age=5",
        });

        return Output.Format($"https://{accountName}.blob.core.windows.net/{containerName}/{blobName}?{blobSAS.Result.ServiceSasToken}");
    });
}
And lastly, I create an Extension to run my script:
var extension = new Extension("ssrsvmscript", new Pulumi.Azure.Compute.ExtensionArgs
{
    Name = "ssrsvmscript",
    VirtualMachineId = ssrsVm.Id,
    Publisher = "Microsoft.Compute",
    Type = "CustomScriptExtension",
    TypeHandlerVersion = "1.10",
    Settings = deploymentFiles["ssrsvm.ps1"].Apply(script => @" {
        ""commandToExecute"": ""powershell -ExecutionPolicy Unrestricted -File ssrsvm.ps1"",
        ""fileUris"": [" + "\"" + script + "\"" + "]}")
});
Hope that saves some time for anyone else struggling with the same problem.
I have the following serializer:
class ServerSimpleConfigSerializer(mixins.GetCSConfigMixin, serializers.ModelSerializer):
    mp_autoteambalance = serializers.BooleanField(label='Auto Team Balance', default=True, required=False)
    mp_friendlyfire = serializers.BooleanField(label='Friendly Fire', default=False, required=False)
    mp_autokick = serializers.BooleanField(label='Auto team-killer banning and idle client kicking', default=True, required=False)
    # hostname = serializers.CharField(label='Hostname', max_length=75, required=True)
    rcon_password = serializers.CharField(label='RCON Password', max_length=75, required=True)
    sv_password = serializers.CharField(label='Server Password', max_length=75, required=False)
    mp_startmoney = serializers.IntegerField(label='Start Money', required=False, validators=[MinValueValidator(800), MaxValueValidator(16000)])
    mp_roundtime = serializers.FloatField(label='Round Time', required=False)
    mp_timelimit = serializers.IntegerField(label='Map Time Limit', required=False)

    fpath = os.path.join(settings.ROOT_DIR, "cs16/tmp/server.json")

    class Meta:
        model = CS16Server
        fields = ('name', 'game_config', 'mp_autoteambalance', 'mp_friendlyfire',
                  'mp_autokick', 'hostname', 'rcon_password', 'sv_password',
                  'mp_startmoney', 'mp_roundtime', 'mp_timelimit')
        read_only_fields = ('name', 'game_config',)
The model has the following fields: name, game_config (big text) and hostname.
How can I return the fields defined above from the serializer, even though they are not present on the model?
I would like to set some custom values for each field and return them as JSON. Is that possible?
The values for the fields defined above are actually found inside the game_config field.
I would like to parse those values and return them, and I would not want to store them as separate fields on the model.
So: parse game_config, obtain pairs (field0, val0) ... (fieldN, valN), and set those values on the corresponding serializer fields.
For now I only get the following response:
{
"name": "Chronos",
"game_config": "hostname \"A New Gameservers.com Server is Born\"\nrcon_password \"\"\nsv_password \"1410271\"\nsv_contact email#domain.com\nsv_region 255\nsv_filterban 1\nsv_logbans 0\nsv_unlag 1\nmp_startmoney 800\nmp_chattime 30\nmp_footsteps 1\nsv_footsteps 1\nmp_logdetail 0\nmp_logmessages 0\nmp_timelimit 30\nmp_autokick 1\nmp_autoteambalance 1\nmp_flashlight 0\nmp_forcerespawn 0\nmp_forcechasecam 0\nmp_freezetime 0\nmp_friendlyfire 0\nmp_hostagepenalty 0\nmp_limitteams 0\nmp_roundtime 5\nmp_tkpunish 1\nsv_voiceenable 1\nsv_voicecodec voice_speex\nsv_voicequality 3\nsv_alltalk 0\nsv_restartround 1\nsv_maxspeed 320\nsv_proxies 1\nallow_spectators 1\nsv_allowupload 1\npausable 0\ndecalfrequency 40\nmp_falldamage 0\nsv_cheats 0\nsv_lan 0\nsv_maxrate 20000\nsv_minrate 4000\nexec listip.cfg",
"mp_autoteambalance": true,
"mp_friendlyfire": false,
"mp_autokick": true,
"hostname": "none"
}
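One possible sketch of the parsing idea described above, overriding to_representation on the serializer (the exact key handling is guessed from the game_config sample shown, so treat it as illustrative only):

# inside ServerSimpleConfigSerializer
def to_representation(self, instance):
    data = super().to_representation(instance)

    # game_config is a newline-separated list of "key value" pairs,
    # e.g. 'mp_startmoney 800' or 'hostname "A New Gameservers.com Server is Born"'
    parsed = {}
    for line in instance.game_config.splitlines():
        parts = line.split(None, 1)
        if len(parts) == 2:
            parsed[parts[0]] = parts[1].strip('"')

    # overwrite the declared serializer fields with the parsed values
    for field in ('hostname', 'rcon_password', 'sv_password', 'mp_startmoney',
                  'mp_roundtime', 'mp_timelimit', 'mp_autoteambalance',
                  'mp_friendlyfire', 'mp_autokick'):
        if field in parsed:
            data[field] = parsed[field]
    return data

An alternative would be to declare each of these as a SerializerMethodField, but overriding to_representation keeps the parsing in one place.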
I'm testing a Django signal to send an email, but I'm getting the following error:
'list' object has no attribute 'splitlines'
@receiver(post_save, sender=Booking)
def new_booking(sender, instance, **kwargs):
    if instance.firstname:
        firstname = [instance.firstname]
        # lastname = [instance.lastname]
        email = [instance.email]
        # phone = [instance.phone]
        subject = [instance.service]
        # date = [instance.date]
        # time = [instance.time]
        # fullname = [firstname + lastname]
        # details = [service]
        send_mail(firstname, subject, email,
                  ['cmadiam@abc.com'], fail_silently=False)
Am I missing something?
Thanks again!
Got this working... if someone needs it... here's the code...
from django.core.mail import send_mail
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Booking

@receiver(post_save, sender=Booking)
def new_booking(sender, instance, **kwargs):
    if instance.firstname:
        # send_mail(subject, message, from_email, recipient_list) expects plain
        # strings for subject, message and from_email; passing lists is what
        # caused the "'list' object has no attribute 'splitlines'" error
        firstname = instance.firstname
        email = instance.email
        subject = instance.service
        send_mail(firstname, subject, email,
                  ['cmadiam@abc.com'], fail_silently=False)
I am trying to pass the PowerShell script in the file IIS.txt, which is present in the CWD, as user data.
I don't see the script running on the server. I am not sure if I am missing something. Any help would be appreciated.
resource "aws_instance" "db1" {
ami = "ami-1234567890"
instance_type = "t3.small"
subnet_id = "${aws_subnet.db.0.id}"
key_name = "ireland"
user_data = "${file("IIS.txt")}"
tags = {
Name = "sql node 1"
}
}
I've used a template_file data source and a local_file resource for this.
data "template_file" "user_data" {
template = "${file("iis.txt")}"
}
resource "local_file" "user_data" {
content = "${data.template_file.user_data.rendered}"
filename = "user_data-${sha1(data.template_file.user_data.rendered)}.ps"
}
Then update the user_data property to use the content of the local_file resource.
resource "aws_instance" "db1"
{
ami = "ami-1234567890"
instance_type = "t3.small"
subnet_id = "${aws_subnet.db.0.id}"
key_name = "ireland"
user_data = "${local_file.user_data.content}"
tags =
{
Name = "sql node 1"
}
}
This also allows you to get a little fancier: make the script a template, pull TF variables, etc. into it, and render it just in time before you deploy.
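For example, a small sketch of that idea (the instance_name variable and the corresponding ${instance_name} placeholder inside iis.txt are made up for illustration):

data "template_file" "user_data" {
  template = "${file("iis.txt")}"

  # each key here becomes a ${...} placeholder available inside iis.txt
  vars = {
    instance_name = "sql node 1"
  }
}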