Vertica-Tableau error "Multiple commands cannot be active" - tableau-api

We have a dataset in Vertica (4 billion records), and Tableau queries that data for a dashboard as shown below:
All lists and graphs are separate worksheets in Tableau and use the same connection to the Vertica DB. Each list is a column in the DB, sorted in descending order by the count of items in the dataset's respective column. The graphs are built the same way as the lists but calculated in a slightly different manner. Start Date and End Date form the date range for the data to be queried, like a data-source filter that restricts the query to a fixed window of data, for example the past week or the last month.
But I get this error:
[Vertica][VerticaDSII] (10) An error occurred during query preparation: Multiple commands cannot be active on the same connection. Consider increasing ResultBufferSize or fetching all results before initiating another command.
Is there any workaround for this issue, or a better way to do this?

You'll need a TDC file which specifies a particular ODBC connection-string option to get around the issue.
The guidance from Vertica was to add an ODBC connect-string parameter with the value "ResultBufferSize=0". This apparently forces the result buffer to be unlimited, preventing the error. This is simple enough to accomplish when building a connection string manually or working with a DSN, as shown below. But Vertica is one of Tableau's native connectors. So how do you tell the native connector to do something else with its connection?
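For the manual case, the parameter just rides along as another key/value pair in the ODBC connection string. An illustrative example only; the server, database, and user names are placeholders:
Driver=Vertica;Servername=example.host;Database=mydb;UID=dbuser;ResultBufferSize=0;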
Native Connections in Tableau can be customized using TDC files
“Native connectors” still connect through the vendor’s ODBC drivers, and can be customized just the same as an “Other Databases” / ODBC connection. In the TDC files themselves, “ODBC” connections are referred to as “Generic ODBC”, which is a much more accurate way to think about the difference.
The full guide to TDC customizations, with all of the options, is available here, although it is pretty dense reading. One thing that isn't provided is an example of customizing a native connector. The basic structure of a TDC file is this:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='genericodbc' enabled='true' version='7.7'>
<vendor name='' />
<driver name='' />
<customizations>
</customizations>
</connection-customization>
When using “Generic ODBC”, the class is “genericodbc” and then the vendor and driver name must be specified so that Tableau can know when the TDC file should be applied. It’s much simpler for a native connector — you just use the native connector name in all three places. The big list of native connector names is at the end of this article. Luckily for us, Vertica is simply referred to as “vertica”. So our Vertica TDC framework will look like:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='vertica' enabled='true' version='7.7'>
<vendor name='vertica' />
<driver name='vertica' />
<customizations>
</customizations>
</connection-customization>
This is a good start, but we need some actual customization tags to make anything happen. Per the documentation, to add extra elements to the ODBC connection string, we use a tag named 'odbc-connect-string-extras'. This would look like:
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;' />
One important thing we discovered was that all ODBC connect-string extras need to go in this single tag. Because we wanted to turn on load balancing in the Vertica cluster, there was a second recommended parameter: ConnectionLoadBalance=1. To get both of these parameters in place, the correct method is:
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;ConnectionLoadBalance=1;' />
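Assembled, the complete TDC file for both fixes is just the Vertica framework above with that single customization tag dropped in:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='vertica' enabled='true' version='7.7'>
<vendor name='vertica' />
<driver name='vertica' />
<customizations>
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;ConnectionLoadBalance=1;' />
</customizations>
</connection-customization>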
There is a whole set of other customizations you can put in place to see how they affect performance. Make sure you understand the way each customization option is worded: if it starts with 'SUPPRESS', then giving a 'yes' value turns the feature off; other times you want to set the value to 'no' to turn the feature off. Some of the others we tried were:
<customization name='CAP_SUPPRESS_DISCOVERY_QUERIES' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_PREPARED_QUERY' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_SELECT_STAR' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_EXECUTED_QUERY' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPRESS_SQLSTATISTICS_API' value='yes' />
<customization name= 'CAP_CREATE_TEMP_TABLES' value='no' />
<customization name= 'CAP_SELECT_INTO' value='no' />
<customization name= 'CAP_SELECT_TOP_INTO' value='no' />
The first set were mostly about reducing the number of queries used for metadata detection, while the second set tells Tableau not to use TEMP tables.
The best way to see the results of these customizations is to change the TDC file and restart Tableau Desktop. Once you are satisfied with the changes, move the TDC file to your Tableau Server and restart it.
Where to put the TDC files
Per the documentation:
"For Tableau Desktop on Windows: Documents\My Tableau Repository\Datasources
For Tableau Server: Program Files\Tableau\Tableau Server\\bin
Note: The file must be saved using a .tdc extension, but the name does not matter."
If you are running a Tableau Server cluster, the .tdc file must be placed in the bin folder on every worker node so that the vizqlserver process can find it. And the biggest issue of all: edit these files using a real text editor like Notepad++ or Sublime Text rather than Notepad, because Notepad likes to save things with a hidden .txt ending, and the TDC file will only be recognized if the extension is really .tdc, not .tdc.txt.

Restarting Tableau resolved my issue, which was producing the same error.


Add directory to Kodi using the command line

Is it possible to add a directory to Kodi using the command line? I've been looking for this with no luck so far.
What I'm looking for is to automate, via the command line, the process of adding a directory that you would normally do manually. For some reason this doesn't appear to be a popular question out there; am I missing something?
Crawling the Kodi/XBMC forums and wiki shows a few options... Here's what I've gathered.
Edit the Database Directly (not recommended)
Kodi stores this information in a SQLite database; however, that would be pretty tricky to manipulate yourself, as it requires knowing both the path of each SQLite database file and the relationships between the columns/tables in each database file (assuming it's a strictly relational database file, which most are).
For example:
sqlite3 <path_to_kodi_preferences>/userdata/Database/MyVideos119.db
sqlite> .tables
actor movie_view studio_link
actor_link movielinktvshow tag
art musicvideo tag_link
bookmark musicvideo_view tvshow
country path tvshow_view
country_link rating tvshowcounts
director_link season_view tvshowlinkpath
episode seasons tvshowlinkpath_minview
episode_view sets uniqueid
files settings version
genre stacktimes writer_link
genre_link streamdetails
movie studio
Edit sources.xml
The official wiki mentions <path_to_kodi_preferences>/userdata/sources.xml for this but it still assumes you know how to manipulate an XML file programmatically and the community warns that this is potentially "invasive" and that the official addons/plugins aren't allowed to use this technique.
I dove into this and the XML seems like the way to go, for example, to add Videos:
<video>
<default pathversion="1"></default>
<source>
<name>Movies</name>
<path pathversion="1">/home/ubuntu/Movies/</path>
<allowsharing>true</allowsharing>
</source>
<source>
<name>Video Playlists</name>
<path pathversion="1">special://videoplaylists/</path>
<allowsharing>true</allowsharing>
</source>
+ <source>
+ <name>MyCustomDirectory</name>
+ <path pathversion="1">/home/ubuntu/MyCustomDirectory/</path>
+ <allowsharing>true</allowsharing>
+ </source>
</video>
... however comments suggest Kodi needs to be restarted and that this location still needs to be crawled/refreshed. There may be some "watchdog" add-ons that can do this for you.
Use the Add-On API
Another technique is to use the official Python API, such as UpdateLibrary(database, path); however, examples usually involve Python calling the API directly. Here's an example from the PlexKodiConnect GitHub project:
# Make sure Kodi knows we wiped the databases
xbmc.executebuiltin('UpdateLibrary(video)')
if utils.settings('enableMusic') == 'true':
    xbmc.executebuiltin('UpdateLibrary(music)')
Since the simplest solution is often the best, I would recommend working on a way to automate the sources.xml file, as sketched below. Modifying XML files is well documented in nearly all programming languages (to that point, you could technically brute-force it and just inject an XML string at the given place without an XML parser 😈), and then handle the restart and refresh operations once the XML is confirmed as being updated properly.
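A minimal Python sketch of that approach, assuming the sources.xml layout shown above; the userdata path and the new source's name and directory are placeholders to adjust:
import xml.etree.ElementTree as ET

SOURCES = '/home/ubuntu/.kodi/userdata/sources.xml'  # placeholder: your userdata path

tree = ET.parse(SOURCES)
video = tree.getroot().find('video')

# Build a new <source> element mirroring the existing entries
source = ET.SubElement(video, 'source')
ET.SubElement(source, 'name').text = 'MyCustomDirectory'
path = ET.SubElement(source, 'path', pathversion='1')
path.text = '/home/ubuntu/MyCustomDirectory/'
ET.SubElement(source, 'allowsharing').text = 'true'

tree.write(SOURCES, encoding='utf-8', xml_declaration=True)
# Kodi still needs a restart (or a library update) to pick up the change.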

Is there a programmatic way to create a custom network profile in Windows without importing an XML file?

I'm looking for a way to add a network profile to a system without importing the XML file. Does anyone know the actual syntax to enter all of that data manually if you have it?
Below is an example of the XML file.
Of course I could use this first snippet to do it, but I want to be able to do it manually.
Is anyone familiar with this process?
I want to write a PowerShell script that inputs each of the necessary variables one by one.
netsh wlan add profile filename="C:\path\HOME.xml"
<?xml version="1.0"?>
<WLANProfile xmlns="http://www.microsoft.com/networking/WLAN/profile/v1">
<name>HOME</name>
<SSIDConfig>
<SSID>
<hex>6D797374726F</hex>
<name>mystro</name>
</SSID>
</SSIDConfig>
<connectionType>ESS</connectionType>
<connectionMode>manual</connectionMode>
<MSM>
<security>
<authEncryption>
<authentication>WPA2PSK</authentication>
<encryption>AES</encryption>
<useOneX>false</useOneX>
</authEncryption>
<sharedKey>
<keyType>passPhrase</keyType>
<protected>false</protected>
<keyMaterial>password</keyMaterial>
</sharedKey>
</security>
</MSM>
<MacRandomization xmlns="http://www.microsoft.com/networking/WLAN/profile/v3">
<enableRandomization>false</enableRandomization>
</MacRandomization>
</WLANProfile>
The answer was provided above in the form of a comment, so I cannot accept it as an answer; I leave it here for anyone else who searches for this. Apparently an XML file is required, and manually programming in the variables is not possible at this time.
Microsoft defines the schema as requiring XML: https://learn.microsoft.com/en-us/windows/win32/nativewifi/wlan-profileschema-wlanprofile-element
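Since the XML cannot be avoided, one workaround is to generate it from your variables on the fly and hand it to netsh. A rough Python sketch; the SSID and passphrase are placeholders, and the same templating idea carries over directly to a PowerShell script:
import os
import subprocess
import tempfile

ssid = 'mystro'          # placeholder network name
passphrase = 'password'  # placeholder key material

# Template following the WLANProfile schema shown above
profile = f"""<?xml version="1.0"?>
<WLANProfile xmlns="http://www.microsoft.com/networking/WLAN/profile/v1">
<name>{ssid}</name>
<SSIDConfig><SSID><name>{ssid}</name></SSID></SSIDConfig>
<connectionType>ESS</connectionType>
<connectionMode>manual</connectionMode>
<MSM><security>
<authEncryption>
<authentication>WPA2PSK</authentication>
<encryption>AES</encryption>
<useOneX>false</useOneX>
</authEncryption>
<sharedKey>
<keyType>passPhrase</keyType>
<protected>false</protected>
<keyMaterial>{passphrase}</keyMaterial>
</sharedKey>
</security></MSM>
</WLANProfile>"""

# netsh only accepts a file, so write a temporary one, import it, clean up
with tempfile.NamedTemporaryFile('w', suffix='.xml', delete=False) as f:
    f.write(profile)
    xml_path = f.name
try:
    subprocess.run(['netsh', 'wlan', 'add', 'profile',
                    f'filename={xml_path}'], check=True)
finally:
    os.remove(xml_path)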

Mirth -> HL7 into XML conversion question

I'm new to Mirth Connect, and I'm struggling to convert HL7 into XML. Suppose my HL7 messages have repeating segments, like ORC in ORM messages; how do I iterate over those?
Below is my code:
tmp['Messages']['orderList']['order'][count]['provider']=msg['ORC'][count]['ORC.10']['ORC.10.1'].toString();
but it is throwing an error:
TypeError: Cannot read property "provider" from undefined.
Please help me to proceed further.
It's failing because your count is higher than the number of elements returned by tmp['Messages']['orderList']['order'], so it is returning undefined. The short answer is that you need to add another order node to tmp['Messages']['orderList'] before you can access it. It's hard to say how best to do that without seeing more of your code, requirements, outbound template, etc... Most frequently I build the node first, and then use appendChild to add it.
A simple example would be:
var tmp = <xml>
<Messages>
<orderList />
</Messages>
</xml>;
var prov = 12345;
var nextOrder = <order>
<provider>{prov}</provider>
</order>;
tmp.Messages.orderList.appendChild(nextOrder);
After which, tmp will look like:
<xml>
<Messages>
<orderList>
<order>
<provider>12345</provider>
</order>
</orderList>
</Messages>
</xml>
The technology you are using to work with XML is called E4X, and it runs on the Mozilla Rhino JavaScript engine. Here are a couple of resources that might help you:
https://web.archive.org/web/20181120184304/https://wso2.com/project/mashup/0.2/docs/e4xquickstart.html
https://developer.mozilla.org/en-US/docs/Archive/Web/E4X/Processing_XML_with_E4X

Populating opcua address space with Nodes from an xml schema

I'm working on a project to build an OPC UA server from the specification. I've gone far enough into the implementation; I'm currently working on the write request, and I already have a few nodes in the server address space.
There are a great many nodes, though. It's almost impossible to create and add the nodes one by one.
Anyway, back to the question: I've downloaded an XML file from the OPC Foundation containing the schema for all the nodes in the address space. Here is a link to the XML file.
What is the most efficient way to create nodes from the XML file? I am writing against a C95 compiler.
Below is a quick view of how nodes are represented in the NodeSet XML file:
<Nodes>
<Node i:type="DataTypeNode">
<NodeId>
<Identifier>i=1</Identifier>
</NodeId>
<NodeClass>DataType_64</NodeClass>
<BrowseName>
<NamespaceIndex>0</NamespaceIndex>
<Name>Boolean</Name>
</BrowseName>
<DisplayName>
<Locale></Locale>
<Text>Boolean</Text>
</DisplayName>
<Description>
<Locale></Locale>
<Text>Describes a value that is either TRUE or FALSE.</Text>
</Description>
<WriteMask>0</WriteMask>
<UserWriteMask>0</UserWriteMask>
<RolePermissions />
<UserRolePermissions />
<AccessRestrictions>0</AccessRestrictions>
<References>
<ReferenceNode>
<ReferenceTypeId>
<Identifier>i=45</Identifier>
</ReferenceTypeId>
<IsInverse>true</IsInverse>
<TargetId>
<Identifier>i=24</Identifier>
</TargetId>
</ReferenceNode>
</References>
<IsAbstract>false</IsAbstract>
<DataTypeDefinition i:nil="true" />
</Node>
Programmatically filling a running OPC UA server with nodes is unacceptably slow.
You may want to investigate the ModelCompiler.
I found it fairly straightforward to fill a ModelDesign XML with data and generate code and a NodeSet2.xml. So even if you have no need for the generated C# code, which I suspect is your case, this approach may be useful.
You may also want to look at the UA-.NETStandard repository.
It offers a LoadFromXML method that reads your nodeset pretty quickly. You may find inspiration in that method.
Good luck, and a big thank you for your contributions to the OPC UA world.
Maybe I'm a bit late, but I'll answer in case it helps someone.
If you are using C/C++ with the open62541 SDK, I found that it is possible to generate *.c and *.h files to include in your OPC UA server, as described with some examples here: you only need to run a Python program, providing some parameters and the names of the output files to be generated, then include those files in your OPC UA server.
Another way I found is UaModeler by Unified Automation; with it you can draw your information model in the program and export it to XML or to source files to include in your project.
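Whichever route you take, the parsing half is easy to prototype before committing to a C implementation. A hedged Python sketch that streams Node entries out of a NodeSet file shaped like the excerpt above; the file name and the selection of fields are illustrative, and each yielded record would feed your own node-creation routine:
import itertools
import xml.etree.ElementTree as ET

def local(tag):
    # Strip any XML namespace prefix: '{uri}Name' -> 'Name'
    return tag.rsplit('}', 1)[-1]

def text_of(elem, *path):
    # Descend through child elements by local name; return the leaf text
    for name in path:
        elem = next((c for c in elem if local(c.tag) == name), None)
        if elem is None:
            return None
    return elem.text

def iter_nodes(xml_path):
    # iterparse keeps memory flat even for a very large nodeset
    for _, elem in ET.iterparse(xml_path):
        if local(elem.tag) != 'Node':
            continue
        yield {
            'node_id': text_of(elem, 'NodeId', 'Identifier'),
            'node_class': text_of(elem, 'NodeClass'),
            'browse_name': text_of(elem, 'BrowseName', 'Name'),
            'display_name': text_of(elem, 'DisplayName', 'Text'),
        }
        elem.clear()  # release the subtree we just consumed

# Example: show the first three parsed nodes (placeholder file name)
for n in itertools.islice(iter_nodes('Opc.Ua.NodeSet.xml'), 3):
    print(n)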

Remote stream multiple files in SOLR

I want to use SOLR's remote-streaming facility to extract and index the content of files.
This works fine if I pass stream.file=xxx as a parameter to the HTTP GET method.
However, I have a lot of these files and want to batch them up (i.e. not have a GET per file).
Is there a way I can do this in SOLR?
e.g. I'd like to be able to POST some xml like this:
<add>
<doc stream_file="filename">
<field name="id">123</field>
</doc>
<doc>...
This has been recently asked (and answered) in the solr-user mailing list.
I find that multiple ADDs are fast, so long as you only COMMIT the batch and don't try to COMMIT after every ADD. I would guess that the performance penalty is not worth writing your own RequestHandler.
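To illustrate the batch-and-commit-once pattern with remote streaming, a hedged Python sketch; the core URL and file list are placeholders, and it assumes the stream.file and literal.id parameters of Solr's extract handler, with remote streaming enabled in solrconfig.xml:
import requests

SOLR = 'http://localhost:8983/solr/mycore'  # placeholder core URL

files = [
    ('123', '/data/docs/a.pdf'),  # (document id, server-side file path)
    ('124', '/data/docs/b.pdf'),
]

# One extract request per file, but no per-file commit
for doc_id, path in files:
    resp = requests.get(SOLR + '/update/extract',
                        params={'stream.file': path,
                                'literal.id': doc_id,
                                'commit': 'false'})
    resp.raise_for_status()

# A single commit for the whole batch
requests.get(SOLR + '/update', params={'commit': 'true'}).raise_for_status()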