To use gettext, I need to call bindtextdomain(), providing a path to the installed .mo files. For instance:
bindtextdomain("myappname", "/opt/gnome/share/locale");
Of course, I use defines there, which are set by my autotools build files.
However, I'd like to use gettext before running "make install", because I want to use the translations in "make check" tests.
The path for bindtextdomain() would normally contain .mo files in a structure like this:
de/LC_MESSAGES/myappname.mo
fr/LC_MESSAGES/myappname.mo
Is there any easy way to create that structure of generated files in my local build, so I can just pass a local path to bindtextdomain() during "make check"?
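One approach (just a sketch; the hook name, the language list, and the paths here are my assumptions, not something stated in the question) is to have "make check" compile the .po files into a locale tree inside the build directory with msgfmt, for example via an automake check-local rule:

check-local:
	for lang in de fr; do \
		mkdir -p locale/$$lang/LC_MESSAGES; \
		msgfmt -o locale/$$lang/LC_MESSAGES/myappname.mo $(top_srcdir)/po/$$lang.po; \
	done

The test program could then call bindtextdomain("myappname", ...) with "$(abs_builddir)/locale" (passed in via a define or an environment variable) instead of the installed location.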
I have a project in which I use multiple python virtual environments, a different one for each directory. Is it possible to configure this so that I don't have to change environments manually each time I need to execute files in another dir?
Just to be more clear, if my workspace is like this:
dir1
    file1.py
    file2.py
dir2
    file3.py
    file4.py
I would like to link dir1 with virtual env venv1 and dir2 with venv2. This way, whenever I run file1.py or file2.py, it would automatically use venv1, and if I run file3.py or file4.py, it should use venv2.
I'm checking this link and my first thought is to configure it with a debugging launch file via the 'python' argument. The problem with this is that I would have to create multiple launch options and execute each python file in debug mode.
Is there any other way? Like using workspace settings (the JSON file), but one for each subdirectory I have? Or maybe using the workspace settings with a custom variable that changes based on the directory from which I execute the python file?
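One possibility (a sketch only; the venv locations and file names below are my assumptions) is a multi-root workspace where dir1 and dir2 are separate workspace folders, each carrying its own interpreter setting:

myproject.code-workspace:
{
    "folders": [
        { "path": "dir1" },
        { "path": "dir2" }
    ]
}

dir1/.vscode/settings.json:
{
    "python.defaultInterpreterPath": "${workspaceFolder}/../venv1/bin/python"
}

dir2/.vscode/settings.json:
{
    "python.defaultInterpreterPath": "${workspaceFolder}/../venv2/bin/python"
}

With that in place, running or debugging file1.py/file2.py should pick up venv1, and file3.py/file4.py should pick up venv2, without switching manually.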
I'm having more trouble than I'd like to admit structuring a simple Python project for development in Visual Studio Code.
How should I structure, in my file system, a project that is a simple Python package with a few modules? Just a bunch of *.py files together. My requirements are:
I must be able to step debug it in vscode.
It has a bunch of unit tests using pytest.
I must be able to select a specific test to debug from the vscode testing tab, and it must stop at breakpoints.
pylint must not show any false positives.
The test files must be in a different directory from the main module files.
I must be able to run all the tests from the console.
The module is executed inside a virtual environment created with the Python standard library module venv.
The code will use type hints.
I may use another linter, even another test framework.
Nothing fancy, but I'm really having trouble getting it right. I want to know:
How should I organize my directories: a folder with the main files and a sibling folder with the tests? Or a subfolder with the code and a sub-subfolder with the tests?
Which dirs must have an __init__.py file?
How should the tests import the files from the module? Should I use relative imports?
Should I create a pytest.ini file?
Should I create a .env file?
What should the content of my launch.json (the vscode debugger config file) be?
Common dir structure:
app
    __init__.py
    yourappcode.py
tests (pytest looks for this)
    __init__.py
    test_yourunittests.py
server.py if you have one
.env
.coveragerc
README.md
Pipfile
.gitignore
pyproject.toml if you want
.vscode (helpful)
    launch.json
    settings.json
Or you could do one better: ignore my structure and look at the GitHub pages of some famous Python projects. FastAPI, Flask, asgi, and aiohttp are some that I can think of right now.
Also:
I think absolute imports are easier to work with than relative imports, though I could be wrong.
vscode is able to use pytest. Make sure you have a testing extension; vscode has a built-in one, I'm pretty sure. You can configure it to use pytest and specify your test dir. You can also run your tests from the command line: if you're at the root, just running 'pytest' will pick up your tests dir by default if it's named that. Also, your actual test files need to start with the prefix test_, I think.
The launch.json doesn't need to be anything special. When you click on the settings button next to the play button in the debug panel, vscode will ask what kind of app it is, e.g. if it's a Flask app, select Python and then Flask, and it will auto-generate a settings file which you can tweak however you want to get your app to run, e.g. maybe you want to expose a different port, or the commands to run your app are different.
It sounds to me like you just need to spend a bit of time configuring vscode for your specific Python needs. For example, you can use a virtualenv and linting in whichever way you want. You just need to have a settings.json file in the .vscode folder in your repo where you specify your settings. Configurations to specify the Python virtualenv and linting methods can be found online.
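As a rough illustration of that settings.json (the interpreter path and the tests directory name are assumptions based on the structure above, not anything mandated by vscode):

.vscode/settings.json:
{
    "python.defaultInterpreterPath": "${workspaceFolder}/.venv/bin/python",
    "python.testing.pytestEnabled": true,
    "python.testing.unittestEnabled": false,
    "python.testing.pytestArgs": ["tests"]
}

With this, the testing tab should discover the tests under tests/ using pytest, and the debugger should use the interpreter from the project's virtual environment.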
Is there any way I can just call into a define such as LIBFOO_DIRCLEAN, and just do what was implemented in the define?
Inside HOST_LIBFOO_INSTALL_CMDS, I copy files to the target directory, and I would like 'make package-dirclean' to delete what was copied into the target directory. 'make clean' would obviously do this (and much more), but that is more than I want to do.
I see the following buildroot variables: LIBFOO_EXTRACT_CMDS, LIBFOO_CONFIGURE_CMDS, LIBFOO_BUILD_CMDS, HOST_LIBFOO_INSTALL_CMDS, LIBFOO_INSTALL_TARGET_CMDS, etc.
make foo-dirclean is a simple tool that just deletes the package build directory. In most cases, when the list of files installed by a package does not change over time (only the files' contents change), you can simply rebuild the package and the target directory will be rebuilt correctly.
If you want, you can implement your own foo-myclean step that implements your own logic. However, you must understand that deleting files in the target directory is not supported by Buildroot, so you are on your own.
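A minimal sketch of such a step, with made-up file names and the usual caveat that this is outside what Buildroot supports: you can append an extra rule to package/libfoo/libfoo.mk (the package .mk files are simply included by the top-level Makefile) that removes exactly the files your install commands copied:

libfoo-myclean:
	rm -f $(TARGET_DIR)/usr/bin/foo $(TARGET_DIR)/usr/lib/libfoo.so*

Then 'make libfoo-myclean' would delete only those files, without touching the rest of the target directory or the package build directory.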
SQLAPI++ has an unusual feature where you set a string to tell it where to find the ODBC shared library. In my case this is libtdsodbc.so, and my application actually links that library at build time, but at runtime this is not enough for SQLAPI++ to work.
My code is:
SAConnection conn;
conn.setOption("ODBC.LIBS") = "libtdsodbc.so";
conn.Connect("SERVER=...", "", "", SA_ODBC_Client);
ODBC.LIBS is documented like this:
Forces SQLAPI++ Library to use specified ODBC manager library.
The above code works if you set LD_LIBRARY_PATH to a directory containing libtdsodbc.so. But if you don't, Connect() fails:
libtdsodbc.so: cannot open shared object file: No such file or directory
DBMS API Library 'libtdsodbc.so' loading fails
This library is a part of DBMS client installation, not SQLAPI++
Make sure DBMS client is installed and
this required library is available for dynamic loading
Linux/Unix:
1) The directories in the user's LD_LIBRARY_PATH environment variable
2) The list of libraries cached in /etc/ld.so.cache
3) /usr/lib, followed by /lib
It works again if you set ODBC.LIBS to a full path rather than just a filename. But how can the application know which path?
My application (outside of SQLAPI++) finds libtdsodbc.so via its RUNPATH which is set at build time. This path is not a system path like /usr/lib. I'd like to have SQLAPI++ use the same library which is loaded in the application at runtime.
One idea is for the application to inspect its own RUNPATH, search for libtdsodbc.so, and use that path. But this requires quite a bit of fiddly code to basically reimplement what ld.so already does.
I don't want to bake the path into the executable at build time separately from RUNPATH, because I sometimes edit RUNPATH before deployment (and then I'd need to edit two things).
Ideally I would like to tell SQLAPI++ to just use the library which is already loaded. I can figure this path out by running lsof -p PID | grep libtdsodbc.so, but running shell commands from within the executable is not a good solution (and again, I would rather not reimplement lsof).
You could either use dl_iterate_phdr (the link also includes sample code which prints out lib names) or manually parse /proc/self/maps.
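A rough sketch of the dl_iterate_phdr route (the callback and the way the result is handed back to SQLAPI++ are mine, not from any SQLAPI++ documentation):

#include <link.h>    // dl_iterate_phdr (GNU extension; may need _GNU_SOURCE)
#include <cstring>
#include <string>

// Called once per loaded shared object; records the full path of the
// first entry whose name contains "libtdsodbc.so" and stops iterating.
static int find_tdsodbc(struct dl_phdr_info *info, size_t /*size*/, void *data)
{
    if (info->dlpi_name && std::strstr(info->dlpi_name, "libtdsodbc.so")) {
        *static_cast<std::string *>(data) = info->dlpi_name;
        return 1;  // non-zero return value stops dl_iterate_phdr
    }
    return 0;
}

// ... before connecting:
std::string tds_path;
dl_iterate_phdr(find_tdsodbc, &tds_path);
if (!tds_path.empty())
    conn.setOption("ODBC.LIBS") = tds_path.c_str();

Since the application already links libtdsodbc.so, the entry found this way should be the copy the dynamic loader resolved through RUNPATH, which is exactly the path you want to hand to SQLAPI++.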
For instance, I set the source code path as c:\code\testapp\src. Is this then available as a var I can use, for instance so I can spit out a tag file in a location relative to this, not relative to the working dir of doxygen? I think I'm looking for something like how Ant defines vars for just about everything which can then be re-used; does Doxygen have special vars for any of the config values?
I'm thinking like $PROJECT-NAME or %VERSION% or whatever...
You can use environment variables in the configuration file; the syntax is the same as in a makefile, i.e. $(VAR_NAME)
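For instance (the variable name here is made up, and GENERATE_TAGFILE is just one option where a path matters), in the Doxyfile:

GENERATE_TAGFILE = $(TAG_DIR)/myproject.tag

and set TAG_DIR in the environment before invoking doxygen (e.g. TAG_DIR=/some/build/dir doxygen Doxyfile in a Unix shell). Doxygen expands $(TAG_DIR) when it reads the configuration file, so the tag file ends up relative to whatever path the build passes in rather than relative to doxygen's working dir.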
I am not sure, but I have seen people use variables as part of their build process. For example, the lemon graph library uses cmake, sets a variable for the absolute file path in cmake, and the doxygen config file includes variables such as #abs_top_srcdir#. Through the build process these variables are replaced with the relevant text.