I have a client with 5 stores set up as 5 companies in QB Enterprise. I have been asked to write a program (C# MVC) which includes the ability to transfer inventory between stores. This would require access to more than one store (QB Company).
I'm struggling with how to provide this, since the QBFC connection to a running copy of QB isn't going to let me access all the stores readily. While it would be possible to switch to access Store #2 (I think), I will have other computers still accessing Store #1.
Is there a way for my program to access all 5 stores? I would like to have one site running and serving all the stores.
If I'm out to lunch (and out of luck), any better ideas?
You can access individual QB company files by specifying the company file when you first connect.
However, you can only have one company file open at a time, and the QuickBooks UI cannot be open to a different company file on that machine while you are doing this.
You can open one QB file and extract data from it, then open a different QB file and import the data. We have two programs that work in this manner.
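To make that concrete, here is a minimal sketch of the open/work/close cycle using the QBFC COM interop (QBFC13Lib here; use whichever version matches your install). The file paths and application name are placeholders, and BeginSession will fail if the QuickBooks UI on that machine has a different company file open.

```csharp
// Minimal sketch: serially open each store's company file, do work, close it.
// Assumes a reference to the QBFC13 COM interop and QuickBooks installed locally.
using System;
using QBFC13Lib;

class CompanyFileSwitcher
{
    // Opens a session against one specific company file, runs the given work,
    // then tears the session down so a different file can be opened next.
    static void WithCompanyFile(string qbwPath, Action<QBSessionManager> work)
    {
        var mgr = new QBSessionManager();
        mgr.OpenConnection("", "InventoryTransferApp");   // app name is arbitrary
        try
        {
            // Fails if the QB UI has a *different* company file open on this machine.
            mgr.BeginSession(qbwPath, ENOpenMode.omDontCare);
            try { work(mgr); }
            finally { mgr.EndSession(); }
        }
        finally { mgr.CloseConnection(); }
    }

    static void Main()
    {
        // Read the inventory adjustment out of store #1...
        WithCompanyFile(@"C:\QB\Store1.qbw", mgr => { /* build and send query requests */ });
        // ...then open store #2's file and post the matching adjustment.
        WithCompanyFile(@"C:\QB\Store2.qbw", mgr => { /* build and send add requests */ });
    }
}
```

Because only one file can be open per connection, a single site serving all five stores would have to serialize its QuickBooks access this way rather than holding all five companies open at once.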
I have no experience with OpenText.
I have a client who hosts a large amount of documents on OpenText Content Server. They would like to do things such as:
Navigate through the file system and move files between directories
Upload files (PDF, MS Word docs, etc.) from local machines to the cloud file system
Retrieve files from the cloud file system to be stored and opened on local machines, or read the content of files directly through the API
Change access rights (read/write) of users on Content Server
Are these options possible through the AppWorks REST API?
Lastly, is there any reporting/analytics data available about Content Server, e.g.:
Number of uploads/downloads in a timeframe
Number of users who logged in during a timeframe
etc.
The first four items are possible out of the box with the newer versions.
The latter two would need reports to be created on the server side, which could then be called via REST.
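As a rough illustration of the out-of-the-box part, a hedged sketch against Content Server's REST API v1 follows. The base URL, folder id, credentials, and file path are placeholders; the /api/v1/auth and /api/v1/nodes endpoints, the OTCSTicket header, and node type 144 (document) follow the usual Content Server REST conventions, but verify them against your server's version.

```csharp
// Sketch: authenticate against Content Server, then upload one document.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text.Json;
using System.Threading.Tasks;

class CsUploadDemo
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://cs.example.com/otcs/cs.exe/") };

        // 1. Authenticate; the response body contains a ticket for subsequent calls.
        var auth = await http.PostAsync("api/v1/auth", new FormUrlEncodedContent(new[]
        {
            new KeyValuePair<string, string>("username", "svc_user"),
            new KeyValuePair<string, string>("password", "secret"),
        }));
        using var doc = JsonDocument.Parse(await auth.Content.ReadAsStringAsync());
        http.DefaultRequestHeaders.Add("OTCSTicket",
            doc.RootElement.GetProperty("ticket").GetString());

        // 2. Upload a document into a target folder as multipart form data.
        var form = new MultipartFormDataContent
        {
            { new StringContent("144"), "type" },         // 144 = document node type
            { new StringContent("123456"), "parent_id" }, // target folder node id
            { new StringContent("report.pdf"), "name" },
        };
        var file = new ByteArrayContent(System.IO.File.ReadAllBytes(@"C:\temp\report.pdf"));
        file.Headers.ContentType = new MediaTypeHeaderValue("application/pdf");
        form.Add(file, "file", "report.pdf");

        var resp = await http.PostAsync("api/v1/nodes", form);
        Console.WriteLine(resp.StatusCode);
    }
}
```

Moving, downloading, and permission changes follow the same pattern against the nodes resource; the reporting items would call whatever server-side reports you create, as noted above.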
Preface:
I'm hoping to upgrade an existing application by adding cloud backup and syncing of the customer's data. We want this to be as seamless as possible, but also for the customer's only interface to the data to be the application's front-end.
Our application connects to the oil pipe of a machine and collects data on the oil condition. When a test has completed, we want to push the results to the cloud. Because of the distinct per-test nature of the data (as opposed to one big trend), most IoT platforms don't suit it very well, so we're aiming to release a slightly modified version of the application without the sensor connection, and this will be our remote front-end.
Since the existing application uses a relatively simple file structure to store its data, if we simply replicate these files in the cloud, the remote front-end version can just download them to the same location and it'll work fine. This has led us to Dropbox (or any more appropriate cloud storage system you might recommend).
We hope to use the Dropbox API directly in our application to push and pull the files as necessary. All of this so far we believe is perfectly achievable.
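Something along these lines with the official Dropbox .NET SDK (the Dropbox.Api NuGet package) is what we have in mind; the access token and folder layout below are placeholders, and the token would live on our side rather than be shipped inside the customer front-end.

```csharp
// Sketch: push a completed test file to Dropbox and pull it back down elsewhere.
using System;
using System.IO;
using System.Threading.Tasks;
using Dropbox.Api;
using Dropbox.Api.Files;

class OilDataSync
{
    static async Task Main()
    {
        using var dbx = new DropboxClient("VENDOR_ACCESS_TOKEN");   // placeholder

        // Push a completed test file into a per-customer folder.
        using (var stream = File.OpenRead(@"C:\OilApp\tests\test-0042.dat"))
        {
            await dbx.Files.UploadAsync(
                "/customers/customer-17/tests/test-0042.dat",       // hypothetical layout
                WriteMode.Overwrite.Instance,
                body: stream);
        }

        // Pull the same file down on the remote front-end.
        var download = await dbx.Files.DownloadAsync(
            "/customers/customer-17/tests/test-0042.dat");
        byte[] bytes = await download.GetContentAsByteArrayAsync();
        File.WriteAllBytes(@"C:\OilApp\tests\test-0042.dat", bytes);
        Console.WriteLine($"Synced {bytes.Length} bytes.");
    }
}
```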
Question: Is it possible, and if so how would we go about it, to set up a user system with the requirements below?
The user's personal Dropbox is not used
Dropbox is completely hidden from the user
The application vendor has a top-level user who has access to all data (for analytics; we do not want to store confidential or sensitive data).
When the user logs in they only have access to their folder and any attackers could not disrupt the overall structure. (We understand that if an attacker got the master account then all is lost, but that is an internal issue to keep it secure. As long as the user accounts are isolated this is okay.)
Alternative Question: Is anyone aware of a storage system or IoT system which would better suit this use case? We will still require backups/loss prevention as part of the service.
I want to upload physical files to the content server using GOS. My BASIS consultant made the necessary settings for this in the OAC0, OAC1, etc. tcodes. But now, when I try to upload a file, a message like this appears:
system error when accessing knowledge provider.
Does anyone have an idea about this issue?
This means that your BASIS guy improperly set up the Knowledge Provider in your system.
Binary GOS attachments are stored through SAPoffice, particularly through the Knowledge Provider, and how they are stored is maintained in transaction KPRO, where a certain type of content server can be specified. More info can be found in notes 904711 and 530297. However, if you are not proficient in SAP administration, it's better to assign this to your BASIS guy. If he is not able to do his job, you'd better find another one :)
I'm trying to fetch all events from the entire system using the REST API, to synchronize them with our own application. So far I have been extracting events for every user from their own calendar file via the REST API.
For example:
Fetch johndoe.nsf/api/calendar/events
Fetch jasonmartin.nsf/api/calendar/events
Fetch jeanmoore.nsf/api/calendar/events
etc.
This works with a low number of users, but I need to do it for around 2,500 users, which kills my system.
Is there any central database from which I can extract this data?
I tried this with the resource reservation database, but all I got was an empty response.
No, there is no central database of calendar events. There couldn't be. Notes and Domino is a distributed environment. Information can be spread over dozens of servers.
But you could write a Java or C application that runs on the Domino server and aggregates the information from all the users' calendars into one central database; that application will probably run faster than your remote calls through the REST API. You'll still have to make REST API calls into that central database, though, and the sum of the activity will be greater than what you are dealing with now.
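If you stay with the per-user REST calls in the meantime, throttling them may keep the load survivable. Here is a hedged sketch; the server URL and mail file names are placeholders, and the since/before date-range parameters should be checked against your Domino version's calendar service documentation.

```csharp
// Sketch: fetch each user's calendar events with bounded concurrency,
// assuming the mail file names are known (e.g. from the Domino directory).
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class CalendarAggregator
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://domino.example.com/mail/") };
        var mailFiles = new[] { "johndoe.nsf", "jasonmartin.nsf", "jeanmoore.nsf" }; // ~2,500 in reality
        var throttle = new SemaphoreSlim(8);   // at most 8 requests in flight at once
        var results = new List<string>();
        var tasks = new List<Task>();

        foreach (var nsf in mailFiles)
        {
            await throttle.WaitAsync();
            tasks.Add(Task.Run(async () =>
            {
                try
                {
                    // Restricting the date range keeps each response small.
                    var json = await http.GetStringAsync(
                        $"{nsf}/api/calendar/events?since=2023-01-01T00:00:00Z&before=2023-02-01T00:00:00Z");
                    lock (results) results.Add(json);
                }
                finally { throttle.Release(); }
            }));
        }
        await Task.WhenAll(tasks);
        Console.WriteLine($"Fetched calendars from {results.Count} users.");
    }
}
```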
Maybe my iCal freeware tool could help you
http://abdata.ch/publish-ibm-domino-calendar-entries-in-icalendar-format/
I want to write a C++ administrative app to simplify management of the DBs I am in charge of. Currently, when I want to tell whether there are users connected to multiple Firebird databases operated by two different instances of the server, I have to connect to every single DB and check. That's OK, but I don't want to have to register every new database that gets created when I'm not looking; I want some way to list the databases that are currently open or otherwise in use by the server. The two current uses of this functionality I can think of are:
Auto-inclusion in backup procedure
Application updates, which require users to log off (at a glance I could tell whom to kick off, or at least which department to call)
Firebird does not have an API to list all available databases. Technically Firebird simply doesn't know about the existence of a database until you actually connect to it.
You might be able to find all databases that are being connected to using the Trace API or the monitoring tables, but that does not exclude the possibility that other databases exist on your system.
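If you go the monitoring-table route, here is a minimal sketch of the query, shown in C# with the FirebirdSql.Data.FirebirdClient ADO.NET provider for brevity (the SQL is identical from C++ via the ISC API or a wrapper such as IBPP). Note that the MON$ tables only describe attachments to the database you are connected to, so you still need your own list of database paths.

```csharp
// Sketch: list who is attached to one known database via MON$ATTACHMENTS.
using System;
using FirebirdSql.Data.FirebirdClient;

class WhoIsConnected
{
    static void Main()
    {
        // Placeholder connection string; run this against each database you know about.
        var cs = "DataSource=localhost;Database=C:\\db\\store1.fdb;" +
                 "User=SYSDBA;Password=masterkey;";
        using var con = new FbConnection(cs);
        con.Open();
        using var cmd = new FbCommand(
            "SELECT MON$USER, MON$REMOTE_ADDRESS, MON$REMOTE_PROCESS " +
            "FROM MON$ATTACHMENTS", con);
        using var rdr = cmd.ExecuteReader();
        while (rdr.Read())   // one row per open attachment, including our own
            Console.WriteLine($"{rdr["MON$USER"]} from {rdr["MON$REMOTE_ADDRESS"]} " +
                              $"({rdr["MON$REMOTE_PROCESS"]})");
    }
}
```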