Password protection of AutoCAD DVB files (macros)

I work in a construction projects company, developing AutoCAD tools, mostly with the integrated VBA editor.
The company wants the developed DVB files to stay inside the company, or to somehow make them useless when they are taken outside.
I know that by password-protecting the created DVB files the code can be hidden (although after five minutes of Googling I discovered that it is trivial to unlock them). I am trying to find a way for the developed VBA files to be usable and executable in the office, while their code stays hidden and employees cannot use them outside the office.
I am not sure if this is possible, though. I know that if I develop external EXE files I can use several methods (connect to a local server before running, use a USB stick key, etc.), but I wonder if I can guarantee that the code I wrote in the AutoCAD VBA editor will not be seen and cannot be used outside the office.
Thank you for all the help in advance.
P.S.: Using AutoCAD 2010 on Windows 7 SP1.

In short, you cannot completely protect your DVB source files. As you discovered, information on breaking the password protection is readily available, and doing so is trivial for a tech-savvy user.
If your goal is to prevent users from simply taking the DVB file with them and using it elsewhere (without modifying the source), you can embed some checks into the code which will cause it to fail outside the office. For example, ping your Domain Controller by name and, if no response is returned, stop with an error. This, however, could be removed if someone edited the code (see the first point above).
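As a rough illustration, here is what such a check might look like in VBA. The domain controller name DC01 is only a placeholder, and this assumes ping.exe is available on the workstation; again, anyone who unlocks the DVB can simply delete the check:

    ' Refuse to run when the domain controller is unreachable.
    ' "DC01" is a placeholder; use your actual DC hostname.
    Public Function InsideOffice() As Boolean
        Dim sh As Object
        Set sh = CreateObject("WScript.Shell")
        ' Ping once, wait up to 1 second; Run returns the process
        ' exit code, which is 0 when the host replied.
        InsideOffice = (sh.Run("ping -n 1 -w 1000 DC01", 0, True) = 0)
    End Function

    Public Sub MyTool()
        If Not InsideOffice() Then
            MsgBox "This tool only runs on the company network.", vbCritical
            Exit Sub
        End If
        ' ... actual tool code ...
    End Sub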
If you do need protection on your source, you don't want to go the DVB (i.e. VBA) route. Instead you will want to develop a true plugin with .NET (which would require a rewrite). Of course this isn't foolproof either, as .NET code can easily be decompiled back to source; however, if you run it through a good obfuscator, it becomes difficult (though still possible) for even the most dedicated to modify.
In short, there is no way to fully protect your source; you can only make it more difficult for someone to reverse engineer.


How to make Thunderbird stop during email download?

I finally moved my old Outlook Express stuff to Thunderbird, and am downloading all the emails stored on the POP3 server.
But I don't want Thunderbird to download them all at once. I want 20 at a time, so I can see them, delete the junk ones, and move the remaining ones to their proper folders before downloading more.
I can't find a button to stop the download. Once I make Thunderbird download emails, it just keeps downloading, and the only way I've found to stop it is to close the program.
Is there a better way?
NOTE: while specific counts are not possible, you can switch back and forth between offline and online mode, but it's a pain and requires a confirmation click each time.
Otherwise:
Download all the emails. Once they have all downloaded, use the spam among them to train Thunderbird's junk filters, starting at the beginning of the list. It's a one-time operation; don't overthink it. As you assign 'junk' status to emails, rerun the junk filters on the inbox; once you've created a decent set of spam samples, most of the subsequent emails will be moved to the Junk folder automatically. If you give the junk filter a few hundred samples it will be pretty accurate. Check the Junk folder for false positives, mark them as not junk, and repeat until it's good. This is much less work than doing it 20 emails at a time.
Thunderbird's built-in junk filters are quite good once properly trained.
When viewing spam emails (including in the preview pane), make sure that loading of remote images is disabled, otherwise you will be validating your email address in the spammers' databases (they often use images or links with unique IDs in them that, when loaded or clicked, simply confirm your email address is current). Ideally, disable all HTML viewing and view in plain-text mode only.
As with training your spam filters, the same applies to email rules/filters, which are even easier to create and run.
Once you have made the desired folders, create filters that move the emails into them; after you create each filter, run the filters on the inbox again.
Because the user is fairly stubborn in this case, here are the basics of interacting with a new piece of software:
1. Check to see if the desired feature from another, similar program is supported. If it isn't, and you're sure you didn't just miss it, then either the developers don't care about that feature or the feature is not technically possible given how the software is constructed. The users in general also don't care about that feature, otherwise someone would have filed an issue, others would have piled on over time, and eventually the devs would have gotten tired of that issue and made it happen. In this case, we can assume none of that happened.
2. Check to see if the program supports extensions of some kind. Thunderbird does, so look for an extension that extends the software to do what you want. If nobody wrote this extension, in most cases it means that no person capable of programming cares about that feature, or that the software itself doesn't support it internally.
3. Check on a site like this one to make sure you didn't miss anything, and that the feature doesn't actually exist somewhere, somehow. If your results are negative again, that's about the end of the road in the quest for that personal favorite feature.
4. Check to see if there is another way that sort of emulates what you want. In this case, for example, you can go offline/online; it takes a few steps each time, and it certainly isn't a feature intended to do what the OP asked, but it can be used in this way, crudely.
5. If neither 1 nor 2 shows support for the desired feature, decide how important that feature is to you. If it is very important, research the core codebase and see if you can add the feature via an extension, or hire someone to write the extension for you and maintain it over the years, or don't use the software.
Now, if after these checks you find that not one single person in the world cared enough to add support for the feature, it means there's not a whole lot of demand for it: certainly no demand from the free-software developers who, in the case of Thunderbird, wrote it and its extensions, and probably not from the users either, which in this case is a lot of people globally.
Once you determine that there was and is no demand for the feature, move on to something more productive, and stop complaining that a feature nobody was ever willing to take the time to implement isn't implemented. It means nobody cared enough to do it, that's all; complaining here won't change that fact.
If you have decided to use the new software anyway, then adjust your workflow to the fact that the desired feature is not going to be part of your life anymore. This is also known as being an adult in some societies.
Other similar things that don't support everything that another similar software does:
gimp does not support everything photoshop does
windows does not support everything gnu/linux does
gnu/linux does not support everything that windows does
claws mail does not support everything thunderbird does
thunderbird and outlook express are different pieces of software that are similar in function but internally quite unrelated; they don't even use the same mail storage format, for example.
Mac OSX is different from GNU/Linux, FreeBSD (although internally it's got some similarities), and Windows.
Windows 98 is different from Windows 2000 which is different from Windows XP which is different from Vista which is different from Windows 7 which is different from 8 and 8.1 which is different from Windows 10.
Outlook is different from every email client known to man and woman alike. As is its mail storage format. Be very glad you are not trying to find software that is similar, you'd be out of luck, since nobody else is crazy enough to produce such a mess.
And so on. The software I write does similar things to other software, but it is different, quite different, from all of them. This is why people use it and like it; if it were the same, there would be no reason to use it. If someone posts an issue asking for a feature that a similar but different piece of software has, I'd think: OK, is it a good idea? Does it meet a valid need? If so, is it hard? Is it possible at all (the other software's approach may not even be right, for example)? Is it possible within my software's framework? Did I receive a patch (always a great way to motivate a developer)? If the first condition, it being a good idea, was never met, then you can forget about the rest; they are irrelevant.
I agree this isn't a great Stack Overflow question.

Epicor Newbie looking for direction

I am an Epicor and Crystal Reports newbie. I started working with these programs a month ago, when I was hired. I am still trying to figure out how you know whether you need to customize a BAQ, a dashboard, etc., and how to know where and when to make a new BOM report and such. If anyone out there has some tips, I would greatly appreciate it. I feel slightly intimidated by the program, but I am also determined to learn my way through it.
Thanks!
Toohey! Welcome to the world of Epicor!
Although I'm sure in the past couple of months you have learned the ropes, here are some extra tips to keep you moving forward:
That is not part of the system functionality
In order to keep costs under control, err on the side of not making system customizations to meet all user requests. You will quickly see that adding a quick field as a customization to a form isn't just the five-minute change it seems to be. You will soon be creating several custom reports and dashboards to report off of that field, and in many situations the cost of the change soon outweighs the benefit. As you become more familiar with this, try to balance ROI against the high cost of Epicor system customizations. It is best to lead with "that is not part of the system functionality", and when they push the issue, treat even small changes as controlled projects.
BAQ and Report Changes
Inevitably, you will need to customize the system's BAQs and Reports to meet your business needs because the standard system isn't designed exactly for your business.
Epicor ships standard BAQs that start with 'z', and many standard reports. You should avoid editing the stock BAQs and reports, because they will be overwritten with each Epicor patch. Instead, copy the standard BAQs and rename the copies using your company initials as a prefix. Similarly, you want to create a custom reports folder, separate from (or within) the standard reports folder, where you place all of your modified reports. You can then link the menu to the BAQ Report or Report Data Definition, and point the report style to the location of your new custom report on the server.
Customizations
Maintenance of customizations has a high long-term cost if you do not have in-house developers. A critical piece of advice here is to make sure all of the code, be it C# or VB, is thoroughly commented. Even if you're generating code with a wizard, do yourself a favor and put a standard header into the script of every customization that records when the customization was created, when it was modified, and everything that was changed (especially if the change was a property change or a field addition that does not clearly appear in the script); see the example header below. Customizations have been known to fail for unexplained reasons, or to produce bad script that is not editable through the standard Epicor interface, and there may come a time when you have to rebuild the customization from scratch using only this change log and what you can clearly see in the form.
You should save your customizations with an obvious standard naming convention (something like ORDER_ENTRY_CSR_YYMMDD), and make sure you update all menus to reflect the newest customization for the purpose you're using it for. We also export our customizations for archival, just in case something should happen. Note that if you do not increment the customization name on a change and then update the menu items, users will still use locally cached versions of the page until they clear their client cache, so I always recommend incrementing. Finally, for customizations and every other custom exportable object in Epicor, do yourself a favor and export them to either a source control system or a file repository, so that when you deploy a faulty customization, rolling back to the previous version is quick and painless.
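For illustration only (the fields here are suggestions, not an Epicor standard, and the name and dates are made up), such a header might look like this in a VB customization script:

    ' =====================================================================
    ' Customization : ORDER_ENTRY_CSR_140612   (hypothetical example name)
    ' Form          : Sales Order Entry
    ' Created       : 2014-06-12  J. Smith
    ' Modified      : 2014-07-03  J. Smith
    ' Changes       : Added a UD field to the header panel (property
    '                 change only; does not appear in this script).
    '                 Added a row rule to highlight credit-held customers.
    ' =====================================================================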
BPM Directives
As you're probably aware by now, BPM directives are powerful tools which can be used to update tables and prevent users from making terrible business decisions. A note on these is similar to customizations - comment comment comment!
Consultant Use
If you are using external consultants to create BPMs or Customizations, mandate distribution of commented source code that can be understood internally by one of your team members.
I hope this helps!
Source: 4 yrs experience as an Epicor ERP programmer
I would like to add that you should develop any customization, BPM, or BAQ/dashboard in the test system, because any error in a solution can stop users from performing their jobs. Also, you can use a powerful tool called tracing options that helps you recognize where to place BPM directives. Furthermore, there is a huge Epicor forum where you can post questions, and a community of consultants, developers, and users will answer them and advise you about best Epicor practices, completely free. You need to register on it; this is the link: www.e10help.com.

PLC Version Control

I need to come up with a CM process for PLC code.
Currently, the system is developed using RSLogix 5000. The build product is a monolithic file that can be loaded onto a PLC for execution and edited directly in the development environment. With multiple developers, this has become a problem: they're stepping on each other's changes.
As an analogy, it's as if, when doing Java development, the only way to edit and save the source were to load a *.jar file into your IDE, make the change, and then save it back to the jar file. This is less than ideal.
How can I coordinate changes between multiple developers working with PLC's?
If we are talking about one big binary file, then a VCS (centralized or decentralized) is not the best tool for the job.
An external repository (a shared disk, for instance) where a batch job copies and labels the current PLC state is better.
See "Tracking Software History"
To avert discontinuities in the historical record of revisions, old versions of programs must be stored.
“We take it a step further, though. Using our MDT AutoSave, we actually go out and interrogate the equipment. Overnight or at whatever frequency is specified, the software reads the programs in the PLCs and then compares that information to the last known program. The version-control software will copy the new program and store it and [then] compare it to the last one.”
Launching version control is fairly simple: all that is required is software installation and then hardware configuration. “You would need a server and a couple of weeks of engineering and you’re good to go,” Perysyn says. However, his company uses a “shrink-wrap approach” that involves installing the software and then customization by users filling in the blanks.
That being said, when you have multiple changes from multiple developers, you need an integration environment where a first delivery can be done and validated, before pushing it to the actual server.
See also this post.
I use Unity Pro, so this may not apply for other brands.
Unity can export an "archive" file which is XML which describes the PLC program and IO setup in its entirety. After commissioning changes, I create an export and check it in to my local Git repo. This gets me an annotated history of changes, but no visual comparison. I can always use UnityDiff for comparison.
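If you want to automate that check-in step, here is a rough sketch of the idea in VBA. The repo path and archive name are placeholders, it assumes git is available on the PATH, and the export itself still has to come from Unity:

    ' Copy an exported Unity archive into a local Git repo and commit it.
    ' All paths and file names below are placeholders for illustration.
    Public Sub CommitArchive(ByVal archivePath As String)
        Dim sh As Object, repo As String, cmd As String
        Set sh = CreateObject("WScript.Shell")
        repo = "C:\plc-repo"
        FileCopy archivePath, repo & "\station1.xef"  ' copy into working tree
        cmd = "cmd /c cd /d " & repo & _
              " && git add -A && git commit -m ""commissioning changes"""
        sh.Run cmd, 0, True    ' run hidden, wait for completion
    End Sub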
Check out http://www.mdtsoft.com/ also
You need specialized versioning system for PLCs like VersionDog.
From the manufacturer:
"Special support with Smart Compares for SIMATIC S5, SIMATIC S7,
SIMATIC PCS 7, WinCC, WinCC flexible, InTouch, CoDeSys, TwinCAT,
Phoenix PC WORX, RSLogix, Schneider Modsoft, Schneider Concept,
Schneider Unity, SINUMERIK 840D, Bosch IndraWorks and more. Also robot
programs from ABB and Kuka and office related data formats like
Microsoft Word, Microsoft Excel and Adobe PDF are perfectly supported
by versiondog.
Update: Here is a screenshot showing a ladder version compare; I guess that's what most PLC folks are interested in. We also use it to schedule an e-mail report if the PLC's offline and online application versions do not match, as an alarm that something has been changed in the PLC but not put into the version control server.
About RSLogix5000 specifically, I have seen developers use an emulated PLC and make their changes online. The final product once developed is then put together with all the comments (as they are not contained in the PLC) and then commissioned. There are issues with changes that cannot be done online, such as AOIs. There are tools in place to stop two people editing the same logic online at once and to take ownership of sections. Backups can be done in the form of uploads, but there isn't any way to track changes.
It is a messy problem, and messier still when you are maintaining a system, since you want an .ACD that you can go online with; unless you are somehow doing a diff with the RSLogix compare tool, you just see unreadable machine code like "+|Éû³´¬ÙÆW×晵‚>Ù,".
The most common revision control I have seen (sadly) is just saving the latest file, then taking a copy and adding the current date to the file name, like the recommended control.com post described.
RSLogix5000 has always prohibited multiple users from opening and editing the same .ACD simultaneously. However, if multiple users have identical .ACD files, open them, and all connect to the same target controller, they can each edit on the controller simultaneously, but only if they are working on different routines. Others' edits appear automatically if they look at another programmer's routine.
Note that working online like this is usually done with the PLC running, sometimes even with the target system (some kind of machine) operating. This kind of arrangement exists for the purpose of completing work faster, or in some cases because the system is huge. No one develops like this; it is really a debug tool and impractical for significant changes.
If one programmer finishes and another is not done, the unfinished work of the other will be saved into the first programmer's .ACD when they save. Whoever saves last will have everyone's work.
As others have mentioned in this thread, using the file date is fairly reasonable. Some companies use a version control variable that is usually displayed on a connected HMI. Other companies use a separate document that records who changed what. Sometimes version notes are placed in a lengthy rung comment in the main routine.
My company uses a separate change log, and dated archive copies are maintained. Multiple programmers are only used in the most extreme cases. Someone is always designated to maintain the offline file integrity, usually the person who will be working the longest, or the project manager.
It is important to note that rung comments are not carried from one user to another before RSLogix5000 v21 because previous versions didn't store comments on the controller.
All this said, you might be trying to manage offline development. I haven't seen any sophisticated methods for this. Usually programmers write the needed routines separately, and a project manager will assemble them into a single project. The cleanest approach I've seen is where a project manager will create an architecture with global functionality, and assign routine work to others, giving them a copy of the .ACD to work with. They return the .ACD with changes, and the project manager copies and pastes their routines into the "master" project.
This is a very good question and it really depends on what you want it to do.
If you are only using Rockwell equipment it might be helpful to look at their solution; I think it's called FactoryTalk AssetCentre.
Currently I am looking into using Bazaar from Canonical.
One thing that VonC pointed out is that a piece of software that can interrogate the PLC is a definite plus; not a must in my opinion, but it sure as hell helps.
Am I reading your question properly in that you have multiple developers working on the same PLC code at the same time? It's a scary thought, but I know it sometimes needs to happen. Siemens PLCs are a bit easier to program with multiple developers, but I would assign one person to consolidate and test all the changes before committing to the PLC. Any VCS will let you create branches for every developer, but how you get them to consolidate their changes is the million dollar question.
Bart.
A simple thing to do would be to do a text diff on the .l5k files so you can easily see whether a developer has been messing with part of the file that is outside of their scope.
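As a rough sketch of the idea (the file names are placeholders, and a real diff tool such as fc.exe, WinMerge, or git diff will do this far better), something like this flags lines that appear in the new export but not the old one:

    ' Naive one-way line diff of two RSLogix .L5K exports: prints lines
    ' present in the new file but absent from the old one. Ignores
    ' duplicates and ordering; an illustration, not a real diff.
    Public Sub DiffL5K(ByVal oldFile As String, ByVal newFile As String)
        Dim fso As Object, seen As Object, ts As Object, s As String
        Set fso = CreateObject("Scripting.FileSystemObject")
        Set seen = CreateObject("Scripting.Dictionary")
        Set ts = fso.OpenTextFile(oldFile, 1)   ' 1 = ForReading
        Do While Not ts.AtEndOfStream
            seen(ts.ReadLine) = True            ' index the old lines
        Loop
        ts.Close
        Set ts = fso.OpenTextFile(newFile, 1)
        Do While Not ts.AtEndOfStream
            s = ts.ReadLine
            If Not seen.Exists(s) Then Debug.Print "+ " & s
        Loop
        ts.Close
    End Sub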
I saw this question just now from a link at Stack Exchange: Are There Realistic/Useful Solutions for Source Control for Ladder Logic Programs. Rather than have a link-only answer, I'll dupe my answer here:
There is actually a canned solution - from GE-IP of all places. Check out Proficy Change Management. This product does version control from a PLC control systems point of view, rather than a pure version control of files point of view - it works as a layer sitting on top of a VCS (the scary part is that originally this VCS was Visual SourceSafe) and handles rights management, reporting and checkout/checkin.
While the product is from GE-IP, it is designed to support a variety of PLC and HMI systems out of the box.
Full disclosure: I used to work for a company selling and installing PCM (but that was 7 years ago), so if you ask me what it was like back then, I'm likely to tell you where it all went wrong!
In my company we just started a trial with Copia.io.
Check it out; our first tests look very promising!
It brings branching, merging, ladder diff, etc. for multiple PLC platforms (Rockwell, Siemens, Codesys).
PS: I work for a company that builds machines; we were looking for versiondog-like solutions with a bit more power in collaboration and diffing capabilities. I have used tools like Mercurial, Git, and Tortoise in past companies (though not for PLCs).

Deploying .EXE to network drive? [closed]

What are the problems with deploying an .EXE to a network drive and having users execute the .EXE over the network?
The advantage is that upgrades only need to be made to the one location. What are the disadvantages?
I would instead consider creating an MSI (http://en.wikipedia.org/wiki/Windows_Installer) file for your application and a Group Policy to facilitate distribution throughout your company (http://support.microsoft.com/kb/816102).
There are a number of freeware MSI tools. Good ones that come to mind are http://www.advancedinstaller.com/ and http://wix.codeplex.com/
The EXE is one thing, but you also need to consider any DLLs and other shared resources that may be associated with the app.
Some DLLs may be shipped with the EXE - you'd have to put those on the remote drive with the EXE, which would cause additional network traffic if it needed to use them.
Other DLLs may be part of Windows, but there could be versioning issues if your workstations have different versions of Windows, or even different service packs or patches, while all running a common version of the app.
And what about licensing? Does the app's license actually allow you to install it on a network drive - many software companies are very specific about this sort of thing, so you need to really be careful if you don't want to get caught out.
In short, it sounds like a good idea to get a quick win for your deployment management, but it probably causes far more issues than it solves.
If you really want to go down this path, you should maybe consider alternatives like remote desktop (e.g. Citrix or Terminal Server); there are much better ways of achieving your goals than just sticking everything on a network drive.
One problem is file locking. In a Windows environment, if a user executes the application directly from a network share, the application's files are locked. This prevents the application from being updated with a newer version if someone has left the application open.
You can get around this by disabling the network share, updating the app, and then re-enabling the share.
If you write your application using an Object Capability Security model, as defined in Mark S. Miller's Ph.D. thesis, Robust Composition: Towards a Unified Approach to Access Control and Concurrency Control, then you will not have any security drawbacks.
On the other hand, the "disadvantage" is that you must now manage access control via the object graph. The application should only have access to whatever permissions you give it. As some have already mentioned, Windows has a basic protection policy which locks the application files and thus prevents anyone from modifying the EXE until the application instance(s) is closed.
Really, the key issue here is you have to ask yourself what authority the program and its component parts should have. If it requires local user permission, then you will either have to design around that or give the program permission.
Understanding the implications of this, and doing it well, is not an easy task.
For our program we decided against a shared exe. We thought it would be harder to support (IT needs to kick users off to unlock files before updates, users won't know where the exe is on the network, share/network file permissions need to be modified by IT, etc.) and that we should emulate the behavior of other programs where possible (client software is normally installed on the clients).
The main disadvantage would be the network drive being unavailable.
Beyond that, the language the EXE is written in (which you didn't specify) matters, as .NET has some security issues when running from a network drive.
It depends on what the application does. My application would be problematic to deploy over the network because the configuration files it uses are all in the same folder as the EXE, or in a subfolder. If every user runs it off the network, they could potentially modify the configuration files and screw things up for everyone else.
Thankfully, my app is only going to be deployed on separate workstations. :)
They might not have all the files your app needs installed. If they don't, you'll need to create a setup. If they do and it works and everyone's drives are mapped correctly, you should be fine.
I run a vendor's app like this at work. They didn't design for it, but it works without an issue. I have all the shortcuts pointing to the UNC path. This particular app doesn't use files in the exe directory, so file locking isn't an issue. It's also hooked up to SQL Server for the data, so the data store isn't an issue either. (It would be a major problem if the app used a local SQLite, Access, or some other file-based DB.)
If your app is a .NET app, this WILL NOT work without some major modifications to each machine's security settings, which is probably a bad idea anyway. If you're talking about a .NET app, you should use ClickOnce. I use it for a few apps at work as well; it's great and easy to use.
The problem is there isn't a definitive answer to your question, just a bunch of "it depends" qualifications. The big issues, AFAIK, are using local files for data storage, be they text files or databases. It is awesome for updates, though, which is why the app mentioned above is run like this.
This is perfectly doable. Be sure to set the "Run from CD-ROM" flag (if I recall correctly, the /SWAPRUN linker option) in the Visual Studio settings when compiling; this prevents the image from being backed directly by the binary, so you can upgrade it while people are running it. I am not running Windows at the moment, so I can't check, but you may be able to set this flag for DLLs, too.
One problem with doing this is that if your program associates itself with files, when the network changes and computers are renamed everybody's PC starts to run like a dog. Explorer has a tendency to query these things at funny times.
Another more serious problem is that if somebody accidentally deploys a broken version, it's not just the early adopters who get stuffed!
For an easy life, personally I recommend XCOPY deployment...
For .NET applications, we have observed BadImageFormatException, which we have come to believe results from network glitches (or computers losing network connectivity at key moments, for example over WiFi) while reading the EXE or DLL files.
IMHO this is a really bad design decision. We have a third party application in our company which is designed exactly like this.
In order for the program to run properly it requires full sharing on that folder; in this case the worst part was that the program kept its freaking DATABASE in the same shared folder (yeah, I was shocked too when I found out)!!! It didn't take long until someone wiped every file that was not in use from that folder, including the database, of course :)
I really recommend a client-server approach, even if you have to buy/build a smart installer with auto-update features to overcome deployment issues.

Is there any form of Version Control for LSL?

Is there any form of version control for Linden Scripting Language?
I can't see it being worth putting all the effort into programming something in Second Life if, when a database goes down over there, I lose all of my hard work.
Unfortunately there is no source control in-world. I would agree with giggy. I am currently moving my projects over to a Subversion (SVN) system to get them under control. Really should have done this a while ago.
There are many free & paid SVN services available on the net.
Just two free examples:
http://www.sourceforge.net
http://code.google.com
You also have the option to set one up locally so you have more control over it.
Do a search on here for 'subversion' or 'svn' to learn more about how to set one up.
[edit 5/18/09]
You added in a comment you want to backup entire objects. There are various programs to do that. One I came across in a quick Google search was: Second Inventory
I cannot recommend this or any other program as I have not used them. But that should give you a start.
[/edit]
-cb
You can use the Meerkat viewer to back up complete objects, or use some of the test programs from libopenmetaverse to back up in a text environment. I think you can back up scripts from the inventory with them.
Jon Brouchoud, an architect working in SL, developed an in-world collaborative versioning system called Wikitree. It's a visual SVN without the delta-differencing that occurs in typical source code control systems. He announced that it was being open sourced in http://archvirtual.com/2009/10/28/wiki-tree-goes-open-source/#.VQRqDeEyhzM
Check out the video in the blog post to see how it's used.
Can you save it to a file? If so, then you can use just about anything: SVN, Git, VSS...
There is no good source control in game. I keep meticulous version information on the names of my scripts and I have a pile of old versions of things in folders.
I keep my source out of the game for the most part and use SVN. LSLEditor is a decent app for working with the scripts, and if you create a solution with objects, it can emulate a lot of the in-game environment (giving objects, reading notecards, etc.).
I personally keep any code snippets that I feel are worth keeping around on github.com (http://github.com/cylence/slscripts).
Git is a very good source code manager for LSL since its comparisons work line by line. The reason this is so crucial is that most Second Life scripts live in ONE FILE (since they can't call each other... grrr), so having the comparison done at the file level is not nearly as effective; comparing line by line is perfect for LSL. With that said, it also (like SourceForge and Google Code) allows you to make your code publicly viewable (if you so choose) and available for download in a compressed file for easier distribution.
Late reply, I know, but some things have changed in Second Life, and some things, well, have not. Since the Third Party Viewer policy still keeps a hard wall up against saving and loading objects between the viewer and the system, I was thinking about another possibility so far completely overlooked: bots!
Scripted agents, AKA bots, have all the usual avatar actions available to them. Although I have never seen one used as an object repository, there is no reason you couldn't create one. Logged in as a separate account, the agent can be wherever you want, automatically or by command; it can then collect any or all objects you are working on at set intervals or on command, and anything it has collected may be given to you or to collaborators.
I won't say it's easy to script an agent, and I couldn't speak to writing an extension for a scripted agent myself, but if you don't want to start from scratch there is an extensive open-source framework to build on: Corrade. Other bot services don't seem to list 'object repository' among their abilities either, but any that support CasperVend must already provide the ability to receive items on request.
Of course the lo-fi route, just regularly taking a copy and sending the objects to a backup avatar, may still be a simple backup solution for a single user, although that does necessitate logging in as the other account, either in parallel or once every 20 or so items, to be sure they are being received and not capped by the server. This process cannot rename the items or sort them automatically like a bot could. Identically named items are listed in inventory with the most recent at the top, but this is a mess when working with multiples of various items.
Finally, there is a Coalesce feature for managing several items as one in inventory. This is currently not supported for sending or receiving objects, but in the absence of a bot it can make it easier to keep track of projects you don't wish to actually link as one item. (Caveat: don't rez 'no-copy' coalesced items near 'no-build' land parcels; any that cannot be rezzed are completely lost.)