Directory not notified as modified when a file is deleted - watchman

Using watchman 4.9.0 with inotify.
My subscription is ["subscribe", "/home/meta", "Buffy", {"expression": ["since", "c:1517100837:2861:1:1"],
"fields": ["name", "exists", "oclock", "ctime_ns", "new", "mode"]}].
When I create a new file in the directory, I get a notification with two entries: one indicates that the directory changed and the other indicates that a new file is available.
PS: I am talking about creating a file inside a directory inside the root dir. That is, I am monitoring /home/meta, I have a directory /home/meta/test, and I create a file /home/meta/test/z. If the file is created directly inside the "root" dir, I don't get any notification about the directory at all.
When I delete that file, I only get an "exists": false notification about the deleted file, but I don't see any notification about a change in the directory.
This matters because I am migrating a legacy application to watchman, and it observes only directory changes: when a directory changes, it issues a "since" request for it to learn about the actual changes. This does not work for deletes.
Suggestions?
Thanks!
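For reference, the per-directory "since" request described above would look roughly like this as a watchman query (a sketch, not an official recipe: the clock value and paths are the ones from the question, and the "dirname" term is one way to scope the query to the test subdirectory):

```json
["query", "/home/meta", {
    "since": "c:1517100837:2861:1:1",
    "expression": ["dirname", "test"],
    "fields": ["name", "exists", "new"]
}]
```

Because deleted files are reported with "exists": false on the file entries themselves, querying for files this way avoids depending on the parent directory being reported as modified.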

Related

appService.zipIgnorePattern: How do I ignore a file in a subfolder?

I am currently publishing a Python web application from VS Code to an MS Azure App Service.
I have a subfolder that contains a rather large SQLite .db file which I would like to exclude when publishing to Azure: "\instance\Output.db"
I have tried the following (see the last entry, which is meant to exclude Output.db), but it does not exclude the file. How do I specify the subfolder and .db file in this zipIgnorePattern so that the file is excluded?
"appService.zipIgnorePattern": [
"__pycache__{,/**}",
"*.py[cod]",
"*$py.class",
".Python{,/**}",
"build{,/**}",
"develop-eggs{,/**}",
"Output.db{,/**}"
],

Ignoring a local file is deleting the depot file

I am using SmartGit with GitHub.
I have a config.json file on my remote GitHub repository, with hidden passwords, at the root of the app.
I need to keep a different config.json file in my local repository, with the real passwords.
Even though I try to ignore config.json locally, it is sometimes still reported as 'modified'.
Other times, when it finally gets ignored (by right-clicking and choosing Ignore), SmartGit reports 1 'staged', and config.json ends up deleted from GitHub when I push the commit. I don't understand why.
This is my .gitignore file:
.DS_Store
/config.json
config.json
node_modules
/uploads
/node_module
/dist
# local env files
.env.local
.env.*.local
# Log files
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
My config.json file, with blanks that I need to leave as they are on GitHub, because Heroku needs them:
{
"localhost_db": "mongodb://localhost:27017/",
"mongoDb_atlas_db": "mongodb+srv://jose:x#cluster0-6kmcn.azure.mongodb.net/vue-starter-webpack?retryWrites=true&w=majority",
"dev": false,
"db_name": "vue-starter-webpack",
"ftp_config": {
"host": "ftpupload.net",
"user": "epiz_26763901",
"password": "x",
"secure": false
},
"node_file_path": "./tmp/files/",
"cloudinary_token": {
"cloud_name": "ddq5asuy2",
"api_key": "354237299578646",
"api_secret": "x"
},
"logs_path": "tmp/logs/logs.txt"
}
Is there any workaround? I have tried plenty of things already. What does "staged" mean? How can I keep one version of the file on GitHub and a different one locally?
EDIT: I am trying out this command, and it seems to work:
git update-index --assume-unchanged config/database.yml
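A related workaround (a sketch, demonstrated in a throwaway repository) is to remove the file from the index with git rm --cached, so that it stays on disk but is no longer tracked, and then let .gitignore keep it out of future commits:

```shell
set -e
# Set up a disposable repo that mimics the situation in the question.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email "demo@example.com"
git config user.name "demo"
echo '{"password":"real"}' > config.json
git add config.json
git commit -qm "add config"
# Remove the file from the index only; the working copy stays on disk.
git rm --cached -q config.json
echo "config.json" >> .gitignore
git add .gitignore
git commit -qm "stop tracking config.json"
git ls-files config.json              # prints nothing: no longer tracked
test -f config.json && echo "still on disk"
```

Note that, unlike assume-unchanged, this removes the file from the remote on the next push, so it only fits if the remote copy should not exist at all.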
Ignore modified (but not committed) files in git?
I have a config.json file on my remote GitHub repository, with hidden passwords, at the root of the app.
That... is not a good practice. If that file (config.json) contains any sensitive information, it should not be added/committed, but explicitly ignored.
What you can commit is config.json.tpl, a template file (which is essentially what your config.json is right now).
From there, you can generate the right config.json file locally, and automatically on git clone/git checkout.
The generation script will:
search the right passwords from an external secure referential (like a vault)
replace the placeholder value in config.json.tpl to generate the right config.json
For that, register a content filter driver in a .gitattributes declaration.
(image from "Customizing Git - Git Attributes", from "Pro Git book")
The smudge script will generate (automatically, on git checkout or git switch) the actual config.json file as mentioned above.
Again, the generated actual config.json file remains ignored (by the .gitignore).
See a complete example at "git smudge/clean filter between branches".
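A minimal sketch of wiring up such a filter driver, shown in a throwaway repository (the filter name "secrets" and the script ./generate-config.sh are hypothetical; the script would do the vault lookup and placeholder substitution described above):

```shell
set -e
# Disposable repo just to demonstrate the wiring.
repo=$(mktemp -d)
cd "$repo"
git init -q .
# Declare that config.json.tpl goes through the "secrets" filter.
echo 'config.json.tpl filter=secrets' >> .gitattributes
# smudge runs on checkout (template -> working tree),
# clean runs when staging (working tree -> repository).
git config filter.secrets.smudge './generate-config.sh'
git config filter.secrets.clean 'cat'
git config filter.secrets.smudge   # prints ./generate-config.sh
```

Here clean is simply cat, so the committed template never picks up real secrets; only the smudge side does any generation.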

Old feature file path is used even after updating to a new path

I am new to Cucumber and I am automating a scenario. Initially I kept my feature files in the path C:\Users\test\eclipse-workspace\Automation\src\test\resources\featureFile. Then I moved the feature files to a different path (C:\Users\test\eclipse-workspace\Automation\src\test\com\test\automation\features). I have updated the same in @CucumberOptions as shown below.
@CucumberOptions(features = {
"src/test/java/com/test/automation/features/CO_Self_Service_Home_Page_Personalizations.feature" }, glue = {
"src/test/java/com/oracle/peoplesoft/HCM/StepDefinitions" })
But when I try to run the feature, I am getting the below exception stating the feature file is not found. Here the path shown in the exception is the old path. I am not sure from where it is fetched as I have updated the new path in Cucumber options. Can you please help me understand the cause of this issue.
Exception in thread "main" java.lang.IllegalArgumentException: Not a file or directory: C:\Users\test\eclipse-workspace\Automation\src\test\resources\featureFile\Self_Service_Home_Page_Personalizations.feature
at cucumber.runtime.io.FileResourceIterator$FileIterator.<init>(FileResourceIterator.java:54)
at cucumber.runtime.io.FileResourceIterator.<init>(FileResourceIterator.java:20)
at cucumber.runtime.io.FileResourceIterable.iterator(FileResourceIterable.java:19)
at cucumber.runtime.model.CucumberFeature.loadFromFeaturePath(CucumberFeature.java:103)
at cucumber.runtime.model.CucumberFeature.load(CucumberFeature.java:54)
at cucumber.runtime.model.CucumberFeature.load(CucumberFeature.java:34)
at cucumber.runtime.RuntimeOptions.cucumberFeatures(RuntimeOptions.java:235)
at cucumber.runtime.Runtime.run(Runtime.java:110)
at cucumber.api.cli.Main.run(Main.java:36)
at cucumber.api.cli.Main.main(Main.java:18)
There are a couple of points you need to take care of, as follows:
As per best practices, create the features directory, which will contain the feature file(s), strictly through your IDE only (not through other software such as Notepad, TextPad, or Sublime Text 3), as per the image below (New -> File):
Create the feature file, i.e. CO_Self_Service_Home_Page_Personalizations.feature, within the features directory, strictly through your IDE only.
Keep your project structure simple by placing the directory containing the feature file(s) just under the project workspace. For feature files, Cucumber works with directory names, so create the features directory just under your project space Automation (same hierarchy as src). The location of Self_Service_Home_Page_Personalizations.feature will then be:
C:\Users\test\eclipse-workspace\Automation\features\Self_Service_Home_Page_Personalizations.feature
Again, since your Class file containing @CucumberOptions mentions glue = {"StepDefinitions" }, ensure that the Class file containing @CucumberOptions is in the same hierarchy as the figure below:
So your @CucumberOptions will be as follows:
@CucumberOptions(features = {"features" }, glue = {"StepDefinitions" })
Execute your test.
Note: Do not move/copy feature file(s)/directory(ies). Delete the unwanted ones and create new ones through your IDE only.

How do I change ownership/attributes of a single file or single folder with sbt-native-packager for an RPM?

sbt-native-packager, when making an RPM, correctly assigns file ownership as root:root for most files. I have a case where exactly one configuration file (which doesn't exist in the RPM, although it could be added if doing so makes things easier) needs to be writable by the service. The easiest way to do this is to change ownership of the conf folder itself, allowing the service to later create that file. The alternative is to add the file to the RPM and change ownership of just that one file.
I know I can change the ownership of all configuration files with a stanza like this:
linuxPackageMappings in Rpm := {
  linuxPackageMappings.value map {
    case linuxPackage if linuxPackage.fileData.config equals "true" =>
      val newFileData = linuxPackage.fileData.copy(
        user = "newuser",
        group = "newgroup"
      )
      linuxPackage.copy(fileData = newFileData)
    case linuxPackage => linuxPackage
  }
}
But: 1) That doesn't change ownership of the conf folder itself, and 2) That changes ownership of every single file that is a conf file, which I don't want to do. Anyway, no matter what I try, the spec file still has:
%dir %attr(0755,root,root) /path/to/application/conf/
The problem is that it's still owned by root:root. Here is what I tried, for changing ownership of just the one file (using the above stanza with this case):
case linuxPackage if linuxPackage.mappings.head._2 equals "/path/to/application/conf/" =>
Also (the answer may be the same), how can I change exactly one configuration file -- selected by filename -- to withConfig("noreplace") but not the rest? I've figured out that linuxPackageMappings is an instance of Seq[com.typesafe.sbt.packager.linux.LinuxPackageMapping], and each instance of LinuxPackageMapping can have one or more file mappings. I haven't figured out how to remove one mapping from a LinuxPackageMapping and create a new LinuxPackageMapping that contains just the one file. (Or there may be a better way to do this?)
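One possible shape for both per-file changes is the following build.sbt fragment (an untested sketch against sbt-native-packager's LinuxPackageMapping/FileData case classes; "/path/to/application/conf/app.conf" is a hypothetical stand-in for the real destination path). It splits the single file out of whatever LinuxPackageMapping contains it and gives it its own FileData:

```scala
linuxPackageMappings in Rpm := {
  linuxPackageMappings.value flatMap { mapping =>
    // Separate the one target file from the rest of this mapping's entries.
    val (target, rest) =
      mapping.mappings.partition(_._2 == "/path/to/application/conf/app.conf")
    if (target.isEmpty) Seq(mapping)
    else Seq(
      // Keep the remaining files with their original ownership.
      mapping.copy(mappings = rest),
      // A new mapping holding just the one file, with its own ownership
      // and marked %config(noreplace).
      mapping
        .copy(
          mappings = target,
          fileData = mapping.fileData.copy(user = "newuser", group = "newgroup"))
        .withConfig("noreplace")
    )
  }
}
```

The same partition-and-copy approach should answer the second question too, since withConfig then applies only to the split-out mapping rather than to every configuration file.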

opkg install error - wfopen no such file or directory

I have followed instructions to create an .ipk file, the Packages.gz and host them on a web server as a repo. I have set the opkg.conf in my other VM to point to this repo. The other VM is able to update and list the contents of repositories successfully.
But, when I try to install, I get this message. Can you please describe why I am getting this and what needs to be changed?
Collected errors:
* wfopen: /etc/repo/d1/something.py: No such file or directory
* wfopen: /etc/repo/d1/something-else.py: No such file or directory
While creating the .ipk, I had created a folder named data with the file structure /etc/repo/d1/, with something.py stored at the d1 location. I zipped that folder to data.tar.gz and then, together with control.tar.gz and debian-binary, I created the .ipk.
I followed instructions from here:
http://bitsum.com/creating_ipk_packages.htm
http://www.jumpnowtek.com/yocto/Managing-a-private-opkg-repository.html
http://www.jumpnowtek.com/yocto/Using-your-build-workstation-as-a-remote-package-repository.html
It is very likely that the directory called /etc/repo/d1/ does not exist on the target system. If you create the folder manually, and try installing again, it probably will not fail. I'm not sure how to force opkg to create the empty directory by itself :/
Update:
You can solve this problem using a preinst script. Just create the missing directories in it, like this:
#!/bin/sh
mkdir -p /etc/repo/d1/
# always return 0 on success
exit 0
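Separately, it is worth checking how data.tar.gz was packed: if the archive entries start with data/ rather than ./etc/..., the files land in the wrong place on the target. A sketch of packing the tree described in the question (file contents invented), using tar -C so the paths are rooted correctly:

```shell
set -e
work=$(mktemp -d)
cd "$work"
# Recreate the data tree described in the question.
mkdir -p data/etc/repo/d1
echo 'print("hello")' > data/etc/repo/d1/something.py
# Pack from inside data/ so entries are ./etc/repo/d1/..., not data/etc/...
tar -czf data.tar.gz -C data .
tar -tzf data.tar.gz
```

If the listing shows a leading data/ component, the archive was created from outside the folder and should be rebuilt as above.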