I wanted to use the ZXing library and check some demos first. The website says:
The following demo clients are available:
- Unity3D and Vuforia demo (demonstrates encoding of barcodes and decoding of images from a camera with Unity3D)
I downloaded it, and it's just a bunch of DLL files.
How do I run it in Unity? It's supposed to be some sort of sample or demo, because the demos for other platforms, like Windows, contain .EXE files.
You should try the whole demo project.
https://zxingnet.svn.codeplex.com/svn/trunk/Clients/UnityDemo/
I think that will make the use of the libraries much clearer.
You do not "run it" anywhere, as they are just libraries.
The purpose of a DLL is to hold code that you can reference; DLLs are not executables.
You can reference these libraries from within your project, consult the reference for the ZXing library (just Google for it and the Git repository will come up), and then use the methods, classes, etc. from within those DLLs.
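For example, decoding frames from a webcam could look roughly like this. This is a minimal sketch, assuming the Unity-specific build of ZXing.Net (zxing.unity.dll), whose BarcodeReader exposes a Color32[] decode overload:

// Minimal sketch: decode barcodes from a WebCamTexture using ZXing.Net.
// Assumes the Unity build of the library (zxing.unity.dll) is in Assets/Plugins.
using UnityEngine;
using ZXing;

public class BarcodeScanner : MonoBehaviour
{
    WebCamTexture camTexture;
    readonly IBarcodeReader reader = new BarcodeReader();

    void Start()
    {
        camTexture = new WebCamTexture();
        camTexture.Play();
    }

    void Update()
    {
        // Decode returns null when no barcode is found in the current frame.
        var result = reader.Decode(camTexture.GetPixels32(), camTexture.width, camTexture.height);
        if (result != null)
            Debug.Log("Decoded: " + result.Text);
    }
}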
Related
I'm developing a C# library that will be used both as a plugin in some Unity3d projects and in non-Unity3d projects.
I need to use some Unity3d classes (such as UnityEngine.Matrix4x4) and standardize some functions to use valid types, agnostic to whether the project is a Unity project or not.
So, as the title says: what is the right way to include Unity3d libraries in a non-Unity3d library project? Should I just add a reference to the local Unity3d binaries (like UnityEngine.dll) in my project? If so, which is the right folder to look for these binaries in (they appear in several different folders in the Unity installation folder)?
If you are not actually running the Unity engine for rendering or handling input, it may be a case of forcing a round peg into a square hole, both technically and legally.
The most elegant solution would likely be to write your library in C# using more agnostic libraries. For example, the Matrix4x4 class you mentioned has an equivalent in System.Numerics. Alongside the library, all that's needed is a light wrapper converting a System.Numerics.Matrix4x4 to a UnityEngine.Matrix4x4. Sometimes the Unity devs themselves do things like this; for example, the Unity.Mathematics.float3 struct works better in ECS land than a standard Vector3.
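Such a wrapper might look roughly like the sketch below. It lives on the Unity side and assumes the usual convention difference (System.Numerics matrices are row-major with translation in M41..M43, while UnityEngine.Matrix4x4 keeps translation in the last column), so it transposes; adjust if your conventions differ:

// Hypothetical bridge: convert a System.Numerics.Matrix4x4 to a
// UnityEngine.Matrix4x4 by transposing (row-major to column-major usage).
public static class MatrixBridge
{
    public static UnityEngine.Matrix4x4 ToUnity(System.Numerics.Matrix4x4 m)
    {
        var u = new UnityEngine.Matrix4x4();
        u.m00 = m.M11; u.m01 = m.M21; u.m02 = m.M31; u.m03 = m.M41;
        u.m10 = m.M12; u.m11 = m.M22; u.m12 = m.M32; u.m13 = m.M42;
        u.m20 = m.M13; u.m21 = m.M23; u.m22 = m.M33; u.m23 = m.M43;
        u.m30 = m.M14; u.m31 = m.M24; u.m32 = m.M34; u.m33 = m.M44;
        return u;
    }
}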
FYI, if you're looking for how some particular system works, check out Unity's C# reference GitHub repository, for example the Matrix4x4 struct. Just be aware that copying and pasting Unity source code is not allowed.
I want to be able to feed the camera frames from a webcam into Unity. I made a .NET 4.5 C# DLL using MediaFrameReader and event listeners. Here is some other user's implementation for accessing the Hololens camera frames: Hololens - Access Camera Frames.
When I import the DLL into my 2018.2.5 Unity project, it gives me the following error:
Unloading broken assembly "....", this assembly can cause crashes in the runtime
TypeLoadException: Could not find method due to a type load error
The C# plugin built successfully many times in Visual Studio. Also, I have properly set the Api Compatibility Level (in player settings) to .NET 4.x. What could fix this?
I have already looked at all the other SO answers related to this, but they do not seem to help. Thanks for all the help.
EDIT: https://issuetracker.unity3d.com/issues/unity-fails-to-load-net-4-dot-6-assemblies-with-typeloadexception is the most closely related post, but it has no solution.
I have a few possible solutions for you to explore.
DLL issues:
Workaround:
Comment out your UWP code (the part that uses the DLL), then build it in Unity without the DLL. In the generated UWP solution, install the package from NuGet or manually import the DLL, then uncomment your code and finish your development. This is a short-term solution. It is going to be annoying, as you will re-build your solution many times and have to comment/uncomment code, re-add DLLs, and so on.
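To reduce the comment/uncomment churn, the UWP-only code can instead be guarded with Unity's ENABLE_WINMD_SUPPORT define, which Unity sets for UWP builds with Windows Runtime support. A minimal sketch (the class and member names here are hypothetical):

// Sketch: the UWP-only branch compiles only in UWP builds, so the Unity
// editor can compile the project even without the UWP DLL present.
#if ENABLE_WINMD_SUPPORT
using Windows.Media.Capture.Frames;
#endif

public class FrameSource
{
#if ENABLE_WINMD_SUPPORT
    MediaFrameReader reader; // WinRT type, only visible in UWP builds
#endif

    public void StartCapture()
    {
#if ENABLE_WINMD_SUPPORT
        // UWP-specific capture setup using the DLL goes here.
#else
        UnityEngine.Debug.Log("Frame capture is only available in UWP builds.");
#endif
    }
}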
Other possible solutions:
Failed to run reference rewriter with command error with unity error when adding a DLL to the assets folder
Your exact need
From your description, you really do not need everything in the link you referenced (Hololens - Access Camera Frames). You need a much simpler version. I recently created a MediaCapture solution for HoloLens as a workaround, because PhotoCapture in Unity was not working on the HoloLens, and everything works without any additional DLLs. I will post a few links for you (and a rough sketch of the MediaCapture call after them) to see if they help:
MediaCapture Unity & HoloLens: https://github.com/MSAlshair/HoloLensMediaCapture
This may be a good start for you. You can combine it with your original reference. Use this project as a starting point to make sure your project builds correctly, then use the necessary code from the other resource that you posted to accomplish the task you desire. You may need to download Unity 2018.2.12f1, because I didn't test it in 2018.2.5.
MediaCapture & PhotoCapture: Hololens front camera
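For orientation, the core of a photo grab with the UWP MediaCapture API looks roughly like the sketch below. It uses standard Windows.Media.Capture calls and is not the linked projects' exact code:

// Sketch: capture one JPEG frame with UWP MediaCapture into a byte array.
using System;
using System.Runtime.InteropServices.WindowsRuntime;
using System.Threading.Tasks;
using Windows.Media.Capture;
using Windows.Media.MediaProperties;
using Windows.Storage.Streams;

public static class PhotoGrabber
{
    public static async Task<byte[]> CaptureJpegAsync()
    {
        var mediaCapture = new MediaCapture();
        await mediaCapture.InitializeAsync();

        using (var stream = new InMemoryRandomAccessStream())
        {
            await mediaCapture.CapturePhotoToStreamAsync(
                ImageEncodingProperties.CreateJpeg(), stream);

            // Copy the encoded JPEG out of the WinRT stream.
            var bytes = new byte[stream.Size];
            stream.Seek(0);
            await stream.ReadAsync(bytes.AsBuffer(), (uint)stream.Size, InputStreamOptions.None);
            return bytes;
        }
    }
}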
Good Luck!
Simple question,
I have made a DLL that uses MessagePack (from NuGet). Unfortunately, it seems you cannot just drop the MessagePack DLL directly into Unity, as it generates a bunch of errors. I was trying to find the best way to add the MessagePack binaries to Unity when I noticed that there is a unitypackage file that adds all the source code to the Unity project. This, however, is useless to me, as my external library (my DLL) contains a reference to MessagePack.
In the documentation, the author suggests using a shared project for the messages. If I use the attributes in the MessagePack namespace in my external library, then I need to have a reference to MessagePack.dll. Dropping the DLL into my Unity project generates all kinds of errors... Sort of a catch-22...
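To make the dependency concrete, the shared message types in question look something like this minimal sketch (the class is hypothetical; MessagePackObject and Key are the standard MessagePack-CSharp attributes, and they are what force the reference to MessagePack.dll):

// Sketch of a shared message type annotated for MessagePack serialization.
using MessagePack;

[MessagePackObject]
public class PlayerState
{
    [Key(0)]
    public int Id { get; set; }

    [Key(1)]
    public float X { get; set; }

    [Key(2)]
    public float Y { get; set; }
}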
Any idea how I should go about this?
I am researching Adobe's RMSDK for iOS. I have the SDK with all the library files. There is a sample project included in the SDK which works fine, but if I include the lib file in my own sample app, it throws a linker error. Can anyone help me out with this? Is there any sample code that I can refer to regarding this SDK?
Thanks in advance
You can try to add a reference to the rmservice project instead of using the lib file.
Then you must verify that all the paths described in the project settings are correct. I placed my project in the same folder as rmservice so that I didn't have to change any settings.
If you still want to use it as a lib, you should perform two actions:
1) Add the lib in Build Phases, in the Link Binary With Libraries section. Be sure to select your target on the left side, not the project.
2) In your Build Settings, under Library Search Paths, add the path where the library resides.
For iOS, if you are using RMSDK 11, follow the book2png sample in the samples folder; this will allow you to build your own application.
If you are using RMSDK 10, then you will probably run into a 32/64-bit support problem.
I am trying to create an app that will allow me to stream video FROM the iPhone TO a server. My current theory as to how to do this is to create a series of FFmpeg files and send them to the server. As far as I can tell, I have compiled the FFmpeg library correctly for the iPhone.
I followed the instructions here. A series of executable files appeared in the folder, so I'm assuming it worked.
My question is: now what? How do I get these into an app? How do I make calls to these executable files? And, most importantly, will this even work the way I want it to?
You have built the ffmpeg binary, which can run on an iPhone, but you cannot run executables from an app on a (non-jailbroken) phone. So you would have to compile the library and link against that. Then, from your app, call the relevant functions directly, mimicking what the ffmpeg program does.
Although this issue is quite old, this could help other users in the future:
Just take a look at the source code here http://dev.wunderground.com/support/wunderradio/wunderradio.1.9lgpl.zip
Good luck
OK, you said that you successfully compiled FFmpeg for iPhone, right? As mvds said, you can't use the results as executable files. So, in order to use these libraries after compilation finishes, you need to copy all the generated .a libs to your project (as when you add libraries or other frameworks). These libraries are:
libavcodec.a
libavfilter.a
libavutil.a
libswscale.a
libavdevice.a
libavformat.a
libswresample.a
Then you have to configure your project:
1) Click on your project -> Build Settings.
2) Search for "Header Search Paths" and add the folder location of your libraries (the location can be absolute or relative).
3) Click on Build Phases -> Link Binary With Libraries -> Add Other, and add all the .a files.
Voilà! Now you can import and use the FFmpeg libraries in your project:
#include <libavcodec/avcodec.h> // with modern FFmpeg the headers live under per-library directories
#include ...
// More C and/or Objective-C code
To access individual uncompressed frames, you can use the captureOutput:didOutputSampleBuffer:fromConnection: delegate method from AVCaptureVideoDataOutput (there are plenty of examples), and then somehow encode them to H.264, maybe using AVFrame? As far as I know, FFmpeg can also stream using RTSP for live streaming, but its documentation seems close to zero :(
To answer your final question
and most importantly will this even work the way i want it to?
The answer is yes, it can work; I found two libraries that do just what you want to achieve:
http://www.foxitsolutions.com/iphone_h264_sdk.html
http://ios-rtmp-library.com
Both use FFmpeg the same way you are suggesting. This question is a little old, but I have found many users trying to achieve this, so I have a question: did you have success doing this? Can you share your experience or recommendations?