The Motion Capture feature crashes in the 3D Modeling Kit GitHub demo:
https://github.com/HMS-Core/hms-3d-modeling-demo
Step 1: Run the GitHub project.
Step 2: Click on Motion Capture.
Step 3: Click on the Live, Video, or Photo option.
```
java.lang.UnsatisfiedLinkError: No implementation found for int com.huawei.hms.motioncapturesdk.SkeletonDetectionJNI.createInstance(android.content.res.AssetManager, java.lang.String, java.lang.String, java.lang.String, java.lang.String) (tried Java_com_huawei_hms_motioncapturesdk_SkeletonDetectionJNI_createInstance and Java_com_huawei_hms_motioncapturesdk_SkeletonDetectionJNI_createInstance__Landroid_content_res_AssetManager_2Ljava_lang_String_2Ljava_lang_String_2Ljava_lang_String_2Ljava_lang_String_2)
    at com.huawei.hms.motioncapturesdk.SkeletonDetectionJNI.createInstance(Native Method)
    at com.huawei.hms.motioncapturesdk.a.initialize(ImageSkeletonImpl.java:12)
    at com.huawei.hms.motioncapturesdk.t.b.a(RemoteOnDeviceSkeleton.java:14)
    at com.huawei.hms.motioncapturesdk.Modeling3dMotionCaptureEngine.create(Modeling3dMotionCaptureEngine.java:13)
    at com.huawei.hms.motioncapturesdk.Modeling3dMotionCaptureEngineFactory.getMotionCaptureEngine(Modeling3dMotionCaptureEngineFactory.java:1)
    at com.huawei.hms.modeling3d.utils.skeleton.LocalSkeletonProcessor.<init>(LocalSkeletonProcessor.java:67)
    at com.huawei.hms.modeling3d.ui.activity.SelectSourceVideoActivity.onCreate(SelectSourceVideoActivity.java:101)
```
Motion Capture only supports devices with the ARMv8 architecture; this capability requires a 64-bit ARM-based Android phone.
For details, refer to this link for the ARMv8 architecture device list.
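One way to avoid the UnsatisfiedLinkError is to check for the 64-bit ABI up front, before creating the motion-capture engine. Below is a minimal sketch; the class and method names are made up, and on a device you would pass android.os.Build.SUPPORTED_ABIS (the check is kept Android-free here so it runs anywhere):

```java
public class AbiCheck {
    /**
     * Returns true when the device advertises the arm64-v8a ABI that the
     * motion-capture native library requires. On Android, pass
     * android.os.Build.SUPPORTED_ABIS as the argument.
     */
    public static boolean supportsArm64(String[] supportedAbis) {
        for (String abi : supportedAbis) {
            if ("arm64-v8a".equals(abi)) {
                return true;
            }
        }
        return false;
    }
}
```

Gating the Motion Capture menu entries on this check would show an unsupported-device message instead of crashing.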
I am looking for a way to record microphone input from a specific channel.
For example, I want to record separately left/right of an M-Track audio interface or SingStar wireless microphones.
Microphone.Start seems limited in this regard.
Further, I found this thread, which says
Yes, you can assume the microphone at runtime will always be 1 channel.
My questions:
Any workaround to get this working?
Is there maybe an open-source library or a low-level microphone API for Unity?
Is it really not possible to record different microphone channels into different AudioClips in Unity?
I am looking for a solution at least on desktop platforms, i.e., Windows, macOS, and Linux.
Bonus question: Is recording specific microphone channels working in Unreal Engine?
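Whatever native capture route is taken on desktop (the JVM's javax.sound.sampled, for instance, can open a multi-channel input line), the common step is splitting the interleaved PCM buffer the device delivers into per-channel streams. A minimal sketch of that step only, with a made-up class name:

```java
public class ChannelSplitter {
    /**
     * Splits an interleaved PCM buffer (L, R, L, R, ... for stereo) into
     * one sample array per channel, so each channel can be written to its
     * own clip or file. Assumes the buffer holds whole frames.
     */
    public static float[][] deinterleave(float[] interleaved, int channels) {
        int frames = interleaved.length / channels;
        float[][] split = new float[channels][frames];
        for (int frame = 0; frame < frames; frame++) {
            for (int ch = 0; ch < channels; ch++) {
                split[ch][frame] = interleaved[frame * channels + ch];
            }
        }
        return split;
    }
}
```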
I want to make an app like the one in this video: Hand Segmentation + Vuforia Augmented Reality.
The video description says two SDKs could help: Hand Recognition with the Intel Perceptual Computing SDK and Augmented Reality with Qualcomm Vuforia.
The problem is that I don't know which Vuforia feature can support and be combined with the Intel Perceptual Computing SDK. Is it Object Recognition, and how? Any detailed instructions would be appreciated.
I do not know this Intel SDK, but I can tell you how this should basically be done. You need to have Vuforia as the 'main' app and let it take over the camera, as done in all of their samples, then take the camera image (with Unity, this is shown here) and pass it to another service to handle the hand detection. It has nothing to do with which feature of Vuforia you want to use.
I created an AR application in Unity3D using the Vuforia extension. It runs in the Unity editor and on a Samsung Galaxy S3, but when I run it on an Asus TF300TG tablet, I don't see the 3D object: it doesn't find the marker region. Why? What should I do to fix this?
My settings:
EDIT:
I get an error message. Unity says:
Failed to get device caps (0xc0110001).
Well, from your question it is unclear what steps you've taken to get your output. But here's an answer that explains the complete guide to getting a simple application working with Vuforia's Unity extension.
But specifically, make sure that on the AR Camera you have set the Data Set Load Behaviour. Under it you SHOULD see the name of the .unitypackage you imported; check the name and mark it Active.
Also, on the Image Target Behaviour, the Type should be Predefined, and the dataset name and image name must be selected.
For more details, please refer to the link above. Good luck. :)
We are working on a prototype application using Unity3D. Our goal is to create a fluid, fun-to-use cross-platform app.
The problem we are facing right now is streaming (H.264/MP4) video content over the web. This will be a major feature of our app.
I have already tried MovieTextures and the WWW class, but it seems the files must be in Ogg format, which we cannot provide. On the other hand, Handheld.PlayFullScreenMovie seems to be an Android- and iOS-only feature that uses the built-in video player. This would be great if it were supported on other platforms (e.g. Windows Phone 8) as well.
Is there another cross-platform option to stream (H.264/MP4) video content over the web and display it full screen or as a GUI object? Or are there plans to support something like this in the near future? Or is there a stable plugin for such a task?
Thanks
As of Unity 5, Handheld.PlayFullScreenMovie supports Windows Phone and Windows Store, as per http://docs.unity3d.com/ScriptReference/Handheld.PlayFullScreenMovie.html:
On Windows Phone 8, Handheld.PlayFullScreenMovie internally uses Microsoft Media Foundation for movie playback. On this platform, calling Handheld.PlayFullScreenMovie with full or minimal control mode is not supported.
On Windows Store Apps and Windows Phone 8.1, Handheld.PlayFullScreenMovie internally uses XAML MediaElement control.
On Windows Phone and Windows Store Apps, there is generally no movie resolution or bitrate limit; however, higher-resolution or higher-bitrate movies consume more memory for decoding, and weaker devices will start skipping frames much sooner at extremely high resolutions. For example, a Nokia Lumia 620 can only play videos smoothly up to 1920x1080. For these platforms, you can find the list of supported formats here: Supported audio and video formats on Windows Store
MP4 is not a streamable container. If you read the ISO specification, you will see that MP4 cannot be streamed: the MOOV atom cannot be written until all frames are known and accounted for, which is 100% incompatible with live video. There are supersets of MP4 used in DASH that make this possible; essentially, they create a little MP4 file (called a fragment) every couple of seconds. Alternatively, you can use a container designed for streaming, such as FLV or TS.
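The box ordering at issue is easy to inspect: an MP4 file is a sequence of top-level boxes (atoms), each starting with a 32-bit big-endian size and a 4-character type. The sketch below (class name Mp4BoxScan is made up; 64-bit "largesize" boxes are ignored for brevity) lists box types in file order, so a file where "moov" trails "mdat" is immediately visible as non-streamable:

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

public class Mp4BoxScan {
    /**
     * Lists the top-level box types of an MP4 byte stream in file order.
     * If "moov" appears after "mdat", a player cannot start playback until
     * the whole file has arrived -- the reason plain MP4 does not stream.
     */
    public static List<String> topLevelBoxes(byte[] data) {
        List<String> boxes = new ArrayList<>();
        ByteBuffer buf = ByteBuffer.wrap(data); // big-endian by default, as MP4 requires
        while (buf.remaining() >= 8) {
            int size = buf.getInt();            // box size, including the 8-byte header
            byte[] type = new byte[4];
            buf.get(type);
            boxes.add(new String(type));
            buf.position(buf.position() + size - 8); // skip the box payload
        }
        return boxes;
    }

    /** Builds a synthetic box (header plus zeroed payload) for demos. */
    public static byte[] box(String type, int payloadLen) {
        ByteBuffer b = ByteBuffer.allocate(8 + payloadLen);
        b.putInt(8 + payloadLen);
        b.put(type.getBytes());
        return b.array();
    }
}
```

Fragmented MP4 sidesteps the problem by emitting many small moof+mdat pairs after an initial moov, so the player never waits on a trailing index.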
You will probably need to step outside the Unity SDK a bit to enable this.
I had heard that Google TV v3 supports the ability to add custom codecs implemented in Java, but there appears to be no published public API around this. Are there any public examples? I'm asking because I'm working on a media player, have had requests to allow playback of FLAC files, and have found a Java library that decodes these files appropriately.
If not, is there a time frame for making such information available?
They should be available within 3 weeks. That would be for the latest OTA to ARM-based devices.