I am implementing a VANET routing protocol using OMNeT++ (INET Framework), SUMO, and Veins (to generate traffic).
Is it possible to add a configuration in the omnetpp.ini to run the simulation with no vehicles in order to test an application for the RSUs?
If you don't want any traffic in your simulation, you will have to modify the .rou.xml file that your simulation uses.
Your example directory probably contains a file with that extension; simply comment out the vehicle definitions, or leave only two vehicles, as in this case:
<?xml version="1.0" encoding="UTF-8"?>
<routes xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://sumo.dlr.de/xsd/routes_file.xsd">
    <vType accel="5.0" decel="5.0" id="Car" length="5.0" minGap="2.0" maxSpeed="50.0" sigma="0" />
    <route id="route0" edges="1to2 out" />
    <vehicle depart="1" id="veh0" route="route0" type="Car" />
    <vehicle depart="1" id="veh1" route="route0" type="Car" />
</routes>
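If you want no vehicles at all, the same file with the vehicle elements removed should give you an empty road network; a minimal sketch:
<?xml version="1.0" encoding="UTF-8"?>
<routes xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://sumo.dlr.de/xsd/routes_file.xsd">
    <!-- vType and route are kept so the file stays valid; with no vehicle
         elements, SUMO spawns no traffic and the RSUs run alone -->
    <vType accel="5.0" decel="5.0" id="Car" length="5.0" minGap="2.0" maxSpeed="50.0" sigma="0" />
    <route id="route0" edges="1to2 out" />
</routes>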
On the other hand, it is a bit unclear to me how you would test the RSU application if there are no cars: with no transmissions, what would the RSU application react to?
For a minimal example, you could use a single car in your scenario that sends a packet to the RSU.
I'm looking for a way to add a network profile to a system without importing the XML file. Does anyone know the actual syntax to enter all of that data manually if you have it?
Below is an example of the XML file.
Of course I could use the first snippet below to do it, but I want to be able to do it manually.
Is someone familiar with this process?
I want to write a PowerShell script that inputs each of the necessary variables one by one.
netsh wlan add profile filename="C:\path\HOME.xml"
<?xml version="1.0"?>
<WLANProfile xmlns="http://www.microsoft.com/networking/WLAN/profile/v1">
    <name>HOME</name>
    <SSIDConfig>
        <SSID>
            <hex>6D797374726F</hex>
            <name>mystro</name>
        </SSID>
    </SSIDConfig>
    <connectionType>ESS</connectionType>
    <connectionMode>manual</connectionMode>
    <MSM>
        <security>
            <authEncryption>
                <authentication>WPA2PSK</authentication>
                <encryption>AES</encryption>
                <useOneX>false</useOneX>
            </authEncryption>
            <sharedKey>
                <keyType>passPhrase</keyType>
                <protected>false</protected>
                <keyMaterial>password</keyMaterial>
            </sharedKey>
        </security>
    </MSM>
    <MacRandomization xmlns="http://www.microsoft.com/networking/WLAN/profile/v3">
        <enableRandomization>false</enableRandomization>
    </MacRandomization>
</WLANProfile>
The answer was provided above in the form of a comment, so I cannot accept it as an answer; I leave it here for anyone else who searches for this. Apparently an XML file is required, and manually programming in the variables is not possible at this time.
Microsoft's documentation defines the profile element as requiring XML: https://learn.microsoft.com/en-us/windows/win32/nativewifi/wlan-profileschema-wlanprofile-element
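Since the profile has to arrive as XML, a practical compromise is a PowerShell script that takes the variables one by one, builds the XML, and imports it with netsh. A minimal sketch, assuming a WPA2-PSK network (the $ssid and $passphrase values are placeholders):
# Build a WLAN profile XML from variables, then import it with netsh.
# $ssid and $passphrase are placeholder values - substitute your own.
$ssid = "HOME"
$passphrase = "password"
# netsh expects the SSID hex encoding alongside the name.
$hexSsid = ($ssid.ToCharArray() | ForEach-Object { '{0:X2}' -f [int]$_ }) -join ''
$profileXml = @"
<?xml version="1.0"?>
<WLANProfile xmlns="http://www.microsoft.com/networking/WLAN/profile/v1">
    <name>$ssid</name>
    <SSIDConfig>
        <SSID>
            <hex>$hexSsid</hex>
            <name>$ssid</name>
        </SSID>
    </SSIDConfig>
    <connectionType>ESS</connectionType>
    <connectionMode>manual</connectionMode>
    <MSM>
        <security>
            <authEncryption>
                <authentication>WPA2PSK</authentication>
                <encryption>AES</encryption>
                <useOneX>false</useOneX>
            </authEncryption>
            <sharedKey>
                <keyType>passPhrase</keyType>
                <protected>false</protected>
                <keyMaterial>$passphrase</keyMaterial>
            </sharedKey>
        </security>
    </MSM>
</WLANProfile>
"@
# netsh only accepts a file, not inline XML, so write a temp file first.
$tempFile = Join-Path $env:TEMP "$ssid.xml"
Set-Content -Path $tempFile -Value $profileXml -Encoding UTF8
netsh wlan add profile filename="$tempFile"
Remove-Item $tempFile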
Could somebody help a poor developer with upgrading to DITA 1.3? :)
I need to make DITA-OT work with the newer version of the XML files I was given (example below). I need to adjust something in the library, but I don't have any clue where to start. I've replaced the problematic bit with //FOOBAR// just for this example.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE concept PUBLIC "-//FOOBAR//DTD DITA Concept//EN" "file:///D:/InfoShare/Web/Author/ASP/DocTypes/dita-sdl/1.3/dtd/technicalContent/dtd/sdlConcept.dtd">
<?ish ishref="GUID-874B737D-F63A-48C3-887A-571C38D5ED5A" version="1" lang="en-us"?>
<concept xml:lang="en-us" id="xs_help_me_contextually_please" rev="for Desktop" product="Foobar product">
    <title id="GUID-F92ED443-BE97-44C7-AB36-726B2A76ECF9">New DITA declaration topic without any new elements</title>
    <shortdesc id="GUID-8D7A677D-6782-4A65-96B4-F7F4B3CB5CCD">
        <ph>Short description of the topic.</ph>
    </shortdesc>
    <prolog>
        <metadata>
            <category>
                Content area
                <keyword>Templates</keyword>
            </category>
            <keywords>
                <indexterm id="GUID-32379B47-E4F9-4E00-A8A7-383584241D88">indexterm</indexterm>
            </keywords>
        </metadata>
    </prolog>
    <conbody>
        <p id="GUID-A2466389-DC06-4052-A0EE-8684F3C3D7D3">
            <ph>Text here.</ph>
        </p>
    </conbody>
</concept>
If I change FOOBAR to OASIS, then it seems to work; at least it does not give any error. The command that I'm running is:
dita -i=/app/dita/in/foobar.ditamap -f=xhtml -o=/app/dita/out
The error it gives:
[gen-list] [DOTJ079E][ERROR] File 'file:/app/dita/in/xs_help_me_contextually_please.xml' could not be loaded. Ensure that grammar files for this document type are referenced and installed properly. Cannot load file: /D:/InfoShare/Web/Author/ASP/DocTypes/dita-sdl/1.3/dtd/technicalContent/dtd/sdlConcept.dtd (No such file or directory)
[move-meta] I/O error reported by XML parser processing file:/tmp/temp20191106165059386/in/xs_help_me_contextually_please.xml: /tmp/temp20191106165059386/in/xs_help_me_contextually_please.xml (No such file or directory)
[move-meta] file:/app/dita/in/foobar.ditamap:3:327: [DOTX026W][WARN]: Unable to retrieve linktext from target: 'xs_help_me_contextually_please.xml'. Using navigation title as fallback.
I should also add the technicalContent/dtd/sdlConcept.dtd file (which I was also given) somewhere in the library, but I am not sure where. I tried putting it in plugins/org.oasis-open.dita.v1_3 and thought it worked, but after removing the file and keeping //OASIS// in the source XML, it didn't give any error either.
How can it work at all if the path file:///D:/InfoShare/Web/Author/ASP/Doc... does not exist on the system where the import happens (a Docker container)? Is it just informational?
I am very confused by all of this.
Thank you in advance!
It's hard to help you given what you have provided, but I can add some clarifying information:
You are working with DITA source that is (or has been) stored in the SDL CCMS. Depending on the age of the SDL product, it goes by different names: Trisoft, SDL Live Content, SDL Tridion Docs.
DITA 1.3 is backwards compatible with all previous versions of DITA, so you should not have to adjust any DITA source files. But if the DITA source uses different DTDs (as any content stored in the SDL product does), you will need those DTDs, since they are different from the OASIS DTDs that ship with DITA-OT.
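As for the file:///D:/... path in the DOCTYPE: DITA-OT normally resolves the public identifier through its XML catalogs, so the literal system path never needs to exist once a catalog knows the public ID. A minimal sketch of a catalog entry you could contribute (the dtd/sdlConcept.dtd relative path is an assumption about where you place the file, not an SDL-documented location):
<?xml version="1.0" encoding="UTF-8"?>
<!-- catalog.xml: maps the SDL public ID to a local copy of the DTD.
     The relative uri below assumes the DTD sits next to this catalog. -->
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog">
    <public publicId="-//FOOBAR//DTD DITA Concept//EN" uri="dtd/sdlConcept.dtd"/>
</catalog>
The usual way to register such a catalog is through a small DITA-OT plugin of your own rather than by editing the shipped org.oasis-open.dita.v1_3 plugin, which would explain why dropping the DTD into that folder behaved unpredictably.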
Hope this helps a little; you also might have better luck posting on the dita-users list at Yahoo!
Best,
Kris
I am working on a project to build an OPC UA server from the specification.
I've gone far enough in the implementation; I am currently working on the write request, and I already have a few nodes in the server address space.
There are a great many nodes, so many that it is almost impossible to create and add them one by one.
Anyway, back to the question: I've downloaded an XML file from the OPC Foundation containing the schema for all the nodes in the address space. Here is a link to the XML file.
What is the most efficient way to create nodes from the XML file? I am writing against a C95 compiler.
Below is a quick view of how nodes are represented in the NodeSet XML file:
<Nodes>
    <Node i:type="DataTypeNode">
        <NodeId>
            <Identifier>i=1</Identifier>
        </NodeId>
        <NodeClass>DataType_64</NodeClass>
        <BrowseName>
            <NamespaceIndex>0</NamespaceIndex>
            <Name>Boolean</Name>
        </BrowseName>
        <DisplayName>
            <Locale></Locale>
            <Text>Boolean</Text>
        </DisplayName>
        <Description>
            <Locale></Locale>
            <Text>Describes a value that is either TRUE or FALSE.</Text>
        </Description>
        <WriteMask>0</WriteMask>
        <UserWriteMask>0</UserWriteMask>
        <RolePermissions />
        <UserRolePermissions />
        <AccessRestrictions>0</AccessRestrictions>
        <References>
            <ReferenceNode>
                <ReferenceTypeId>
                    <Identifier>i=45</Identifier>
                </ReferenceTypeId>
                <IsInverse>true</IsInverse>
                <TargetId>
                    <Identifier>i=24</Identifier>
                </TargetId>
            </ReferenceNode>
        </References>
        <IsAbstract>false</IsAbstract>
        <DataTypeDefinition i:nil="true" />
    </Node>
</Nodes>
Programmatically filling a running OPC UA server with nodes is unacceptably slow, so you may want to investigate the ModelCompiler.
I found it fairly straightforward to fill a ModelDesign XML with data and generate code and a NodeSet2.xml. So even if you have no need for the generated C# code, which I suspect is your case, this approach may be useful.
You may also want to look at the UA-.NETStandard repository.
It offers a LoadFromXML method that reads your nodeset pretty quickly; you may find inspiration in that method.
Good luck, and many thanks for your contributions to the OPC UA world.
Maybe I'm a bit late, but I'll answer in case it helps someone.
If you are using C/C++ with the open62541 SDK, it is possible to generate *.c and *.h files to include in your OPC UA server, as described with some examples here: you only need to run a Python program, providing some parameters and the names of the output files to be generated, then include those files in your OPC UA server.
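As a rough sketch, the invocation of open62541's nodeset compiler looks something like this (the XML paths and the myNodeSet output name are placeholders; check the open62541 documentation for the exact options your SDK version supports):
# Generate myNodeSet.c/.h from a NodeSet2 XML, building on the base namespace.
# Both XML paths below are placeholders for files from your own checkout.
python nodeset_compiler.py \
    --types-array=UA_TYPES \
    --existing Opc.Ua.NodeSet2.xml \
    --xml myNodeSet.xml \
    myNodeSet
The generated files then register all the nodes at server start-up, which avoids parsing the XML at run time on a constrained C95 toolchain.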
Another way I found is using UaModeler by Unified Automation; in that case you can draw your information model in the program and export it to XML or to source files to include in your project.
We have a dataset in Vertica, and Tableau queries the data (4 billion records) from Vertica for a dashboard, as described below.
All the lists and graphs are separate worksheets in Tableau and use the same connection to the Vertica DB. Each list is a column in the DB, in descending order of the count of items in the dataset's respective column. The graphs are the same as the lists but calculated in a slightly different manner. Start Date and End Date are the date range for the data to be queried, like a data-connection filter that restricts the query to a fixed amount of data, for example the past week, the last month, etc.
But I get this error:
[Vertica][VerticaDSII] (10) An error occurred during query preparation: Multiple commands cannot be active on the same connection. Consider increasing ResultBufferSize or fetching all results before initiating another command.
Is there any workaround for this issue, or a better way to do this?
You'll need a TDC file that specifies a particular ODBC connection-string option to get around the issue.
The guidance from Vertica was to add an ODBC connect-string parameter with the value “ResultBufferSize=0”. This apparently makes the result buffer unlimited, preventing the error. That is simple enough to accomplish when building a connection string manually or working with a DSN, but Vertica is one of Tableau’s native connectors. So how do you tell the native connector to do something else with its connection?
Native Connections in Tableau can be customized using TDC files
“Native connectors” still connect through the vendor’s ODBC drivers, and can be customized just the same as an “Other Databases” / ODBC connection. In the TDC files themselves, “ODBC” connections are referred to as “Generic ODBC”, which is a much more accurate way to think about the difference.
The full guide to TDC customizations, with all of the options, is available here, although it is pretty dense reading. One thing it doesn’t provide is an example of customizing a “native connector”. The basic structure of a TDC file is this:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='genericodbc' enabled='true' version='7.7'>
    <vendor name='' />
    <driver name='' />
    <customizations>
    </customizations>
</connection-customization>
When using “Generic ODBC”, the class is “genericodbc”, and the vendor and driver names must be specified so that Tableau knows when the TDC file should be applied. It’s much simpler for a native connector: you just use the native connector name in all three places. The big list of native connector names is at the end of this article. Luckily for us, Vertica is simply referred to as “vertica”. So our Vertica TDC framework will look like:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='vertica' enabled='true' version='7.7'>
    <vendor name='vertica' />
    <driver name='vertica' />
    <customizations>
    </customizations>
</connection-customization>
This is a good start, but we need some actual customization tags to make anything happen. Per the documentation, to add extra elements to the ODBC connection string, we use a tag named ‘odbc-connect-string-extras’. It looks like this:
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;' />
One important thing we discovered is that all ODBC connect-string extras need to go in this single tag. Because we wanted to turn on load balancing in the Vertica cluster, there was a second recommended parameter: ConnectionLoadBalance=1. To get both of these parameters in place, the correct method is:
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;ConnectionLoadBalance=1;' />
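Putting the pieces together, the complete TDC file for this fix looks like the following (drop ConnectionLoadBalance if you only need the buffer fix):
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='vertica' enabled='true' version='7.7'>
    <vendor name='vertica' />
    <driver name='vertica' />
    <customizations>
        <!-- ResultBufferSize=0 removes the buffer limit; ConnectionLoadBalance=1 enables cluster load balancing -->
        <customization name='odbc-connect-string-extras' value='ResultBufferSize=0;ConnectionLoadBalance=1;' />
    </customizations>
</connection-customization>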
There is a whole set of other customizations you can put in place to see how they affect performance. Make sure you understand the way each customization option is worded: if it starts with ‘SUPPRESS’, then a ‘yes’ value turns the feature off; other times you set the value to ‘no’ to turn the feature off. Some of the other ones we tried were:
<customization name='CAP_SUPPRESS_DISCOVERY_QUERIES' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_PREPARED_QUERY' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_SELECT_STAR' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_EXECUTED_QUERY' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPRESS_SQLSTATISTICS_API' value='yes' />
<customization name= 'CAP_CREATE_TEMP_TABLES' value='no' />
<customization name= 'CAP_SELECT_INTO' value='no' />
<customization name= 'CAP_SELECT_TOP_INTO' value='no' />
The first set is mostly about reducing the number of queries for metadata detection, while the second set tells Tableau not to use TEMP tables.
The best way to see the results of these customizations is to change the TDC file and restart Tableau Desktop. Once you are satisfied with the changes, move the TDC file to your Tableau Server and restart it.
Where to put the TDC files
Per the documentation:
“For Tableau Desktop on Windows: Documents\My Tableau Repository\Datasources
For Tableau Server: Program Files\Tableau\Tableau Server\\bin
Note: The file must be saved using a .tdc extension, but the name does not matter.”
If you are running a Tableau Server cluster, the .tdc file must be placed in the bin folder on every worker node so that the vizqlserver process can find it. The biggest issue of all: edit these files using a real text editor like Notepad++ or Sublime Text rather than Notepad, because Notepad likes to append a hidden .txt extension, and the TDC file will only be recognized if the extension is really .tdc, not .tdc.txt.
Restarting Tableau resolved my issue, which was producing the same error.
I've been playing with the REST functionality in Railo 4, with limited success...
I have two paths set up, the first works exactly as I'd expect but requests to the second seem to exist outside of the application...
I have two mappings:
/api and /admin/api
I created /admin/api first (I'm not sure if that makes a difference), but everything works fine there and not in /api.
I have the following components in both:
<cfcomponent rest="true" restpath="test">
    <!--- the rest="true" component wrapper is assumed here; the original snippet showed only the function --->
    <cffunction
        name="test"
        access="remote"
        returntype="struct"
        returnformat="json"
        output="true"
        httpmethod="GET"
        >
        <cfscript>
            return Application;
        </cfscript>
    </cffunction>
</cfcomponent>
In my application I have a bunch of variables that are created in onApplicationStart. If I run this component from /admin/api they are all available to me; however, running the exact same component from /api, I don't get any!
I set up /admin/api a while ago and it has been functioning with no problems; I'm wondering if I missed a step when I set up /api...
Can anyone explain why I would have access to the application variables in one path, but not in the other?