How to get coordinates of joints in Kinect?

I am learning about Kinect and I have a problem. I want to write the coordinates of the skeleton's joints to a text file, but I don't know how to do that. Can anybody help me?

If you are using the SDK:
using System;
using System.IO;
using System.Linq;
using Microsoft.Kinect;

StreamWriter writer = new StreamWriter(#path); // #path: your output file path
Skeleton[] skeletons = new Skeleton[0];
int frames = 0;
...
void AllFramesReady(object sender, AllFramesReadyEventArgs e)
{
    frames++;
    using (SkeletonFrame sFrame = e.OpenSkeletonFrame())
    {
        if (sFrame == null)
            return;
        // size the buffer to the frame before copying into it
        if (skeletons.Length != sFrame.SkeletonArrayLength)
            skeletons = new Skeleton[sFrame.SkeletonArrayLength];
        sFrame.CopySkeletonDataTo(skeletons);
        // the query yields a sequence, so take the first tracked skeleton (or null)
        Skeleton skeleton = (from s in skeletons
                             where s.TrackingState == SkeletonTrackingState.Tracked
                             select s).FirstOrDefault();
        if (skeleton == null)
            return;
        if (skeleton.TrackingState == SkeletonTrackingState.Tracked)
        {
            writer.Write("{0} {1}#", frames, timestamp); // I don't know how you want to produce the timestamp
            foreach (Joint joint in skeleton.Joints)
            {
                writer.Write(joint.Position.X + "," + joint.Position.Y + "," + joint.Position.Z + ",");
            }
            writer.Write(Environment.NewLine);
        }
    }
}
This takes the joints of a tracked skeleton and writes them to a file without labelling each joint, but they come out in the default order of the JointType enumeration. Note that the positions are in skeleton space, i.e. metres relative to the Kinect sensor.
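For reference, here is a rough sketch of the setup this handler assumes; the sensor field and the InitSensor method name are my own placeholders, not part of the original code:
KinectSensor sensor;

void InitSensor()
{
    sensor = KinectSensor.KinectSensors.FirstOrDefault(s => s.Status == KinectStatus.Connected);
    if (sensor == null)
        return;
    sensor.SkeletonStream.Enable();          // only skeleton data is needed for this example
    sensor.AllFramesReady += AllFramesReady; // the handler shown above
    sensor.Start();
}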

Thank you, but it has one problem with 'skeletons' in this line:
Skeleton skeleton = (from s in skeletons where s.TrackingState == SkeletonTrackingState.Tracked select s);
System.IO.FileStream fs = new System.IO.FileStream(@"F:\Kinect Install\SkeletonBasics-WPF\Coordinates.txt", FileMode.Append, FileAccess.Write, FileShare.None);
int frame = 0;
Skeleton[] skeletons = new Skeleton[0];

public void AllFramesReady(object sender, AllFramesReadyEventArgs e)
{
    StreamWriter sw = new StreamWriter(fs);
    frame++;
    using (SkeletonFrame sFrame = e.OpenSkeletonFrame())
    {
        if (sFrame == null) return;
        // the buffer starts empty, so size it to the frame before copying into it
        if (skeletons.Length != sFrame.SkeletonArrayLength)
            skeletons = new Skeleton[sFrame.SkeletonArrayLength];
        sFrame.CopySkeletonDataTo(skeletons);
        // the query yields a sequence, so take the first tracked skeleton (requires using System.Linq;)
        Skeleton skeleton = (from s in skeletons
                             where s.TrackingState == SkeletonTrackingState.Tracked
                             select s).FirstOrDefault();
        if (skeleton == null)
            return;
        if (skeleton.TrackingState == SkeletonTrackingState.Tracked)
        {
            foreach (Joint joint in skeleton.Joints)
            {
                sw.WriteLine(joint.Position.X + "," + joint.Position.Y + "," + joint.Position.Z + ",");
            }
            //writer.Write(Environment.NewLine);
            sw.Flush();
            sw.Close(); // note: closing the writer also closes fs, so only the first frame will ever be written this way
        }
    }
}

Related

UnityWebRequestAssetBundle.GetAssetBundle doesn't work after changing scene for the same url

I am using the following to download an AssetBundle, and it works in the first scene. But in the second scene the same code doesn't work: uwr.downloadedBytes returns 0. If I restart the app and go straight to the second scene, it works. The strange thing is that when I go back to the first scene from the second scene, the code works as well. I want to know what is going wrong; is it something to do with Unload(false) in the first scene?
private IEnumerator DownloadBundles()
{
AssetBundleList = new List<string>();
m_AssetBundle = new List<AssetBundle>();
m_InstantiatedModels = new List<GameObject>();
yield return StartCoroutine(DownloadBundle(1));
}
private IEnumerator DownloadBundle(int i)
{
string platform = "Android";
string bundleURL = data.url + data.id + "/" + platform + "/" + data.id + "assetbundles" + i.ToString();
using (UnityWebRequest uwr = UnityWebRequestAssetBundle.GetAssetBundle(bundleURL, (uint)data.assetBundleVersion, 0))
{
AsyncOperation asyncOp = uwr.SendWebRequest();
while (!asyncOp.isDone)
{
if (m_ProgressText.gameObject.activeSelf)
{
m_ProgressText.text = "Loading " + (i == 1 ? "" : "More ") + "Models... " + ((int)(asyncOp.progress * 100)).ToString() + "%";
}
yield return null;
}
if (uwr.error != null)
{
throw new UnityException("AssetBundle DownloadHandler had an error: " + uwr.error);
}
else
{
Debug.Log(uwr.downloadedBytes.ToString());
AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(uwr);
string[] assetBundleList1 = bundle.GetAllAssetNames();
assetBundleDownloaded.Add(i);
object[] golist = bundle.LoadAllAssets();
for (int k = 0; k < golist.Length; k++)
{
GameObject go = Instantiate(golist[k] as GameObject, Vector3.zero, Quaternion.identity);
go.SetActive(false);
m_InstantiatedModels.Add(go);
}
m_AssetBundle.Add(bundle);
yield return null;
}
}
}

Unity Path Generation On 2D Grid

roadTopStartX = Random.Range(5, 10); // x position where the path's creation begins
roadTopStartY = Random.Range(8, 12); // y position where the path's creation begins
roadTopLength = Random.Range(4, 9);  // path's length
for (int i = 0; i < roadTopLength; i++)
{
    GameObject tile = GameObject.Find("Tile" + (roadTopStartX + i) + " " + roadTopStartY); // I created a grid whose tiles are named "Tile X Y", e.g. "Tile 0 0"
    GameObject road = Instantiate(roadPrefab, tile.transform.position, tile.transform.rotation);
    road.name = "Road" + " " + (roadTopStartX + i) + " " + roadTopStartY;
    roads.Add(road);
}
This is how I create a random path on a 2D grid. Do you know a better solution? When things become more complex, GameObject.Find becomes a pain for me.
I found a more reliable solution (thanks to derHugo); there is also a dictionary-based sketch after the neighbours example below:
public GameObject findTile(int x,int y)
{
GameObject findTilex = GameObject.Find("Tile" + (tileX + x) + " " +
(tileY + y));
return findTilex;
}
Example of how I get the neighbour tiles:
public void getNeighbours()
{
if (findTile(0, 1) != null)
{
upper = findTile(0, 1);
}
if (findTile(0, -1) != null)
{
below = findTile(0, -1);
}
if (findTile(1, 0) != null)
{
right = findTile(1, 0);
}
if (findTile(-1, 0) != null)
{
left = findTile(-1, 0);
}
if (findTile(1, 1) != null)
{
rightCrossTop = findTile(1, 1);
}
if (findTile(1, -1) != null)
{
rightCrossUnder = findTile(1, -1);
}
if (findTile(-1, 1) != null)
{
leftCrossTop = findTile(-1, 1);
}
if (findTile(-1, -1) != null)
{
leftCrossBottom = findTile(-1, -1);
}
}
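If GameObject.Find still becomes a bottleneck as the grid grows, one alternative (my own sketch, not from the original answer, using a hypothetical TileGrid component) is to register every tile in a dictionary keyed by its grid coordinates and look neighbours up there instead:
using System.Collections.Generic;
using UnityEngine;

public class TileGrid : MonoBehaviour
{
    // caches tiles by (x, y) so lookups avoid GameObject.Find entirely
    private readonly Dictionary<Vector2Int, GameObject> tiles = new Dictionary<Vector2Int, GameObject>();

    public void RegisterTile(int x, int y, GameObject tile)
    {
        tiles[new Vector2Int(x, y)] = tile; // call once when each tile is created
    }

    public GameObject FindTile(int x, int y)
    {
        GameObject tile;
        return tiles.TryGetValue(new Vector2Int(x, y), out tile) ? tile : null;
    }
}
getNeighbours would then call FindTile(tileX + 1, tileY) and so on, with the same null checks as above. (Vector2Int needs a reasonably recent Unity version; a string key or a small custom struct works the same way.)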

Merge textures at Runtime

Is there any way to "bake" one texture to another, except for using SetPixels()?
Right now I'm trying to use something like this, but it's too slow:
public static Texture2D CombineTextures(Texture2D aBaseTexture, Texture2D aToCopyTexture, int x, int y)
{
int aWidth = aBaseTexture.width;
int aHeight = aBaseTexture.height;
int bWidth = aToCopyTexture.width;
int bHeight = aToCopyTexture.height;
Texture2D aReturnTexture = new Texture2D(aWidth, aHeight, TextureFormat.RGBA32, false);
Color[] aBaseTexturePixels = aBaseTexture.GetPixels();
Color[] aCopyTexturePixels = aToCopyTexture.GetPixels();
int aPixelLength = aBaseTexturePixels.Length;
for(int y1 = y, y2 = 0; y1 < aHeight && y2 < bHeight ; y1++, y2++)
{
for(int x1 = x, x2 = 0 ; x1 < aWidth && x2 < bWidth; x1++, x2++)
{
aBaseTexturePixels[x1 + y1*aWidth] = Color.Lerp(aBaseTexturePixels[x1 + y1*aWidth], aCopyTexturePixels[x2 + y2*bWidth], aCopyTexturePixels[x2 + y2*bWidth].a);
}
}
aReturnTexture.SetPixels(aBaseTexturePixels);
aReturnTexture.Apply(false);
return aReturnTexture;
}
The problem is that I need to display a lot of sprites on a 2D surface (blood, enemy corpses, etc.), and just instantiating every sprite greatly reduces fps.
If you are concerned about an fps drop when instantiating prefabs, you should definitely build an object pooling system. So you will have a system that:
Instantiates all objects in the pool and keeps them far away from the main camera
Lets you "borrow" an object from the pool once you need it
Returns the object to the pool once it is no longer needed (for example when the sprite is out of the camera view)
Baking it all into one texture isn't the best practice; you would need huge amounts of RAM for it. Consider the steps above instead, it's a very common approach.
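A minimal sketch of that pattern (the class and field names here are my own, not from any particular project):
using System.Collections.Generic;
using UnityEngine;

// Minimal object pool: pre-instantiate, borrow, return.
public class SpritePool : MonoBehaviour
{
    public GameObject prefab;
    public int poolSize = 50;
    private readonly Queue<GameObject> pool = new Queue<GameObject>();

    void Awake()
    {
        for (int i = 0; i < poolSize; i++)
        {
            GameObject go = Instantiate(prefab, Vector3.one * 10000f, Quaternion.identity); // parked far from the camera
            go.SetActive(false);
            pool.Enqueue(go);
        }
    }

    public GameObject Borrow(Vector3 position)
    {
        GameObject go = pool.Count > 0 ? pool.Dequeue() : Instantiate(prefab); // grow if the pool runs dry
        go.transform.position = position;
        go.SetActive(true);
        return go;
    }

    public void Return(GameObject go)
    {
        go.SetActive(false);
        pool.Enqueue(go);
    }
}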
A fuller example:
using UnityEngine;
using System.Collections;
using System.Collections.Generic;
using System;
using System.Linq;
public class BackgroundPool : MonoBehaviour
{
public static BackgroundPool instance;
public List<BackgroundSection> sectionsLibrary = new List<BackgroundSection>();
public int poolSize = 4;
public List<BackgroundSection> pool = new List<BackgroundSection>();
void Awake()
{
instance = this;
DateTime startGenTime = DateTime.Now;
//generateSectionsPool
for (int i=0; i<sectionsLibrary.Count; i++)
{
for (int j=0; j<poolSize; j++)
{
if (j == 0)
{
sectionsLibrary[i].positionInPool = sectionsLibrary[i].transform.position;
pool.Add(sectionsLibrary[i]);
}
else
{
BackgroundSection section = (BackgroundSection)Instantiate(sectionsLibrary[i]);
section.transform.parent = this.transform;
section.transform.position = new Vector3((-(ExtensionMethods.GetBounds(sectionsLibrary[i].gameObject).extents.x * 2) * j) + sectionsLibrary[i].transform.position.x,
sectionsLibrary[i].transform.position.y);
section.transform.localEulerAngles = Vector3.zero;
section.gameObject.name = sectionsLibrary[i].gameObject.name + ":" + j.ToString();
section.positionInPool = section.transform.position;
pool.Add(section);
}
}
}
Debug.Log("Background Pool generated in: " + (DateTime.Now - startGenTime).TotalSeconds.ToString() + " s");
}
public BackgroundSection GetPiece(Scenery scenery, SceneryLayer _layer)
{
List<BackgroundSection> allScenery = new List<BackgroundSection>();
foreach (BackgroundSection section in pool) { if (section.scenery == scenery) allScenery.Add(section); }
List<BackgroundSection> matchingPieces = new List<BackgroundSection>();
foreach (BackgroundSection section in allScenery) { if (section.sceneryLayer == _layer) matchingPieces.Add(section); }
if (matchingPieces.Count > 0)
{
BackgroundSection pickedSection = matchingPieces[UnityEngine.Random.Range(0, matchingPieces.Count)]; // int Random.Range excludes the max, so pass Count to include the last piece
pool.Remove(pickedSection);
return pickedSection;
}
else
{
Debug.LogError("Cann't get background piece matching criteria, scenery: " + scenery + ", layer" + _layer);
return null;
}
}
public void ReturnPiece(BackgroundSection section)
{
pool.Add(section);
section.transform.parent = this.transform;
section.transform.position = section.positionInPool;
}
}

FMOD pcmreadcallback never is called while playing audio from a microphone source

I'm writing a Unity class to capture and play back audio data from a microphone. The playback part works fine, I can hear my voice in the headphones, but I cannot access the audio samples because pcmreadcallback is never called during playback; it is called only once, inside the createSound method. I think I'm missing some setting. I also tried several OR combinations for the FMOD.MODE flags, but with no luck.
I'm using fmodstudio10510.unitypackage and testing under Windows 7, but it should be fully cross-platform.
Thanks in advance.
Walter
public class AudioInit : MonoBehaviour {
FMOD.System lowlevel = null;
FMOD.Sound snd = null;
// callbacks delegates
FMOD.SOUND_PCMREADCALLBACK pcmreadcallbackPtr = new FMOD.SOUND_PCMREADCALLBACK (pcmreadcallbackFunc);
int driverId;
void Start () {
int channels = 1;
int sampleRate = 8000;
float recordTime = 1.0f;
// get low level instance
FMOD_StudioSystem.instance.System.getLowLevelSystem(out lowlevel);
// fill sound info struct
FMOD.CREATESOUNDEXINFO soundInfo = new FMOD.CREATESOUNDEXINFO ();
soundInfo.cbsize = System.Runtime.InteropServices.Marshal.SizeOf (typeof(FMOD.CREATESOUNDEXINFO));
soundInfo.length = (uint)(sampleRate * channels * sizeof(byte) * recordTime);
soundInfo.numchannels = channels;
soundInfo.defaultfrequency = sampleRate;
soundInfo.format = FMOD.SOUND_FORMAT.PCM8;
soundInfo.pcmreadcallback = pcmreadcallbackPtr;
soundInfo.pcmsetposcallback = pcmsetposcallbackPtr;
soundInfo.dlsname = IntPtr.Zero;
// FMODE MODE flag
FMOD.MODE mode = FMOD.MODE.OPENUSER | FMOD.MODE.LOOP_NORMAL;
// create sound
FMOD.RESULT res = lowlevel.createSound((string)null, mode, ref soundInfo, out snd);
if (res != FMOD.RESULT.OK) {
Debug.Log ("ERROR snd " + res.ToString ());
return;
}
// get driver
res = lowlevel.getDriver (out driverId);
if (res != FMOD.RESULT.OK) {
Debug.Log ("ERROR getDriver " + res.ToString ());
return;
}
// start record from microphone
res = lowlevel.recordStart (driverId, snd, true);
if (res != FMOD.RESULT.OK) {
Debug.Log ("ERROR recordStart " + res.ToString ());
return;
}
uint pos = 0;
uint tries = 10;
// wait for a valid record position
while ( !(pos > 0) && (tries--) > 0 ) {
if ( lowlevel.getRecordPosition(driverId, out pos) == FMOD.RESULT.OK ){
System.Threading.Thread.Sleep(100);
} else { break; }
}
if ( !( pos > 0 )) {
Debug.Log ("ERROR invalid record position");
return;
}
// start playback
FMOD.Channel chn;
res = lowlevel.playSound (snd, new FMOD.ChannelGroup (IntPtr.Zero), false, out chn);
if (res != FMOD.RESULT.OK) {
Debug.Log ("ERROR recordStart " + res.ToString ());
return;
}
}
// only called once during lowlevel.createSound execution
static FMOD.RESULT pcmreadcallbackFunc (IntPtr sound, IntPtr data, uint len){
Debug.Log("pcmreadcallback sample size " + len.ToString());
return FMOD.RESULT.OK;
}
// Update is called once per frame
void Update () {
}
}
The recording system doesn't go via pcmreadcallback; that's why you aren't getting those callbacks.
To access the microphone data, use Sound::lock and Sound::unlock.
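For example, something along these lines could be called from Update() to pull the new PCM8 samples out of the looping record buffer. This is only a sketch under a few assumptions: lastReadPos and soundLengthBytes are my own fields, and the @lock/unlock calls follow the low-level C# wrapper, whose exact signatures may differ slightly between FMOD versions:
uint lastReadPos = 0;         // samples consumed so far (PCM8 mono: 1 sample == 1 byte)
uint soundLengthBytes = 8000; // same value as soundInfo.length above

void ReadNewSamples()
{
    uint recordPos = 0;
    if (lowlevel.getRecordPosition(driverId, out recordPos) != FMOD.RESULT.OK || recordPos == lastReadPos)
        return;

    // bytes recorded since the last read, allowing for wrap-around of the looping buffer
    uint bytesToRead = recordPos > lastReadPos
        ? recordPos - lastReadPos
        : soundLengthBytes - lastReadPos + recordPos;

    IntPtr ptr1, ptr2;
    uint len1, len2;
    if (snd.@lock(lastReadPos, bytesToRead, out ptr1, out ptr2, out len1, out len2) == FMOD.RESULT.OK)
    {
        byte[] samples = new byte[len1 + len2];
        if (len1 > 0) System.Runtime.InteropServices.Marshal.Copy(ptr1, samples, 0, (int)len1);
        if (len2 > 0) System.Runtime.InteropServices.Marshal.Copy(ptr2, samples, (int)len1, (int)len2);
        snd.unlock(ptr1, ptr2, len1, len2);
        // ... process samples here ...
    }

    lastReadPos = recordPos;
}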

Google Maps output=kml broken?

All
I was using the google maps KML output in my iPhone app.
If I type the following on my browser, it used to give an option to save the kml file:
http://maps.google.com/maps?q=restaurant&mrt=yp&num=10&sll=37.786945,-122.406013&radius=5&output=kml
But all of a sudden today it is returning an HTML file. What happened? Any ideas?
I use it in my iPhone app, and it now throws an error because the response is not valid XML. Obviously...
Thanks,
mbh
Extracting Google Directions by parsing the KML file has not been available since 27 July 2012 (Google changed how directions are retrieved; now you can only get them as JSON or XML), so it is time to migrate your code to JSON instead of KML.
See the answer in my own question here (it is for Android only, but for iPhone you can probably follow the same algorithm and apply it).
Google changed something and the KML now shows major turns only. But when using JSON, the path is shown correctly:
public class DrivingDirectionActivity extends MapActivity {
Point p1 = new Point();
Point p2 = new Point();
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
MapView mapView = (MapView) findViewById(R.id.map);
// setting a default value
double src_lat = 18.5535;
double src_long = 73.7966;
double dest_lat = 18.5535;
double dest_long = 73.7966;
Geocoder coder = new Geocoder(getApplicationContext(),
Locale.getDefault());
List<Address> address_src = null;
List<Address> address_dest = null;
try {
address_src = coder
.getFromLocationName(
"Deepmala Housing Complex, Pimple Saudagar, Pimpri Chinchwad",
1);
if (address_src.size() > 0) {
Address loc = address_src.get(0);
src_lat = loc.getLatitude();
src_long = loc.getLongitude();
}
} catch (IOException e) { // TODO Auto-generated catch block
e.printStackTrace();
}
try {
address_dest = coder.getFromLocationName(
"Infosys Phase 2, Hinjewadi Phase II, Hinjewadi", 1);
if (address_dest.size() > 0) {
Address loc = address_dest.get(0);
dest_lat = loc.getLatitude();
dest_long = loc.getLongitude();
}
} catch (IOException e) { // TODO Auto-generated catch block
e.printStackTrace();
}
mapView.setBuiltInZoomControls(true);
GeoPoint srcGeoPoint = new GeoPoint((int) (src_lat * 1E6),
(int) (src_long * 1E6));
GeoPoint destGeoPoint = new GeoPoint((int) (dest_lat * 1E6),
(int) (dest_long * 1E6));
DrawPath(srcGeoPoint, destGeoPoint, Color.GREEN, mapView);
mapView.getController().animateTo(srcGeoPoint);
mapView.getController().setZoom(13);
}
protected boolean isRouteDisplayed() {
// TODO Auto-generated method stub
return false;
}
private void DrawPath(GeoPoint src, GeoPoint dest, int color,
MapView mMapView) {
// connect to map web service
HttpClient httpclient = new DefaultHttpClient();
HttpPost httppost = new HttpPost(makeUrl(src, dest));
HttpResponse response;
try {
response = httpclient.execute(httppost);
HttpEntity entity = response.getEntity();
InputStream is = null;
is = entity.getContent();
BufferedReader reader = new BufferedReader(new InputStreamReader(
is, "iso-8859-1"), 8);
StringBuilder sb = new StringBuilder();
sb.append(reader.readLine() + "\n");
String line = "0";
while ((line = reader.readLine()) != null) {
sb.append(line + "\n");
}
is.close();
reader.close();
String result = sb.toString();
JSONObject jsonObject = new JSONObject(result);
JSONArray routeArray = jsonObject.getJSONArray("routes");
JSONObject routes = routeArray.getJSONObject(0);
JSONObject overviewPolylines = routes
.getJSONObject("overview_polyline");
String encodedString = overviewPolylines.getString("points");
List<GeoPoint> pointToDraw = decodePoly(encodedString);
mMapView.getOverlays().add(new MyOverLay(pointToDraw));
} catch (ClientProtocolException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
// TODO: handle exception
}
}
private List<GeoPoint> decodePoly(String encoded) {
List<GeoPoint> poly = new ArrayList<GeoPoint>();
int index = 0, len = encoded.length();
int lat = 0, lng = 0;
while (index < len) {
int b, shift = 0, result = 0;
do {
b = encoded.charAt(index++) - 63;
result |= (b & 0x1f) << shift;
shift += 5;
} while (b >= 0x20);
int dlat = ((result & 1) != 0 ? ~(result >> 1) : (result >> 1));
lat += dlat;
shift = 0;
result = 0;
do {
b = encoded.charAt(index++) - 63;
result |= (b & 0x1f) << shift;
shift += 5;
} while (b >= 0x20);
int dlng = ((result & 1) != 0 ? ~(result >> 1) : (result >> 1));
lng += dlng;
GeoPoint p = new GeoPoint((int) (((double) lat / 1E5) * 1E6),
(int) (((double) lng / 1E5) * 1E6));
poly.add(p);
}
return poly;
}
private String makeUrl(GeoPoint src, GeoPoint dest) {
// TODO Auto-generated method stub
StringBuilder urlString = new StringBuilder();
urlString.append("http://maps.googleapis.com/maps/api/directions/json");
urlString.append("?origin=");// from
urlString.append(Double.toString((double) src.getLatitudeE6() / 1.0E6));
urlString.append(",");
urlString
.append(Double.toString((double) src.getLongitudeE6() / 1.0E6));
urlString.append("&destination=");// to
urlString
.append(Double.toString((double) dest.getLatitudeE6() / 1.0E6));
urlString.append(",");
urlString
.append(Double.toString((double) dest.getLongitudeE6() / 1.0E6));
urlString.append("&sensor=false");
Log.d("xxx", "URL=" + urlString.toString());
return urlString.toString();
}
class MyOverLay extends Overlay {
private int pathColor;
private final List<GeoPoint> points;
private boolean drawStartEnd;
public MyOverLay(List<GeoPoint> pointToDraw) {
// TODO Auto-generated constructor stub
this(pointToDraw, Color.GREEN, true);
}
public MyOverLay(List<GeoPoint> points, int pathColor,
boolean drawStartEnd) {
this.points = points;
this.pathColor = pathColor;
this.drawStartEnd = drawStartEnd;
}
private void drawOval(Canvas canvas, Paint paint, Point point) {
Paint ovalPaint = new Paint(paint);
ovalPaint.setStyle(Paint.Style.FILL_AND_STROKE);
ovalPaint.setStrokeWidth(2);
ovalPaint.setColor(Color.BLUE);
int _radius = 6;
RectF oval = new RectF(point.x - _radius, point.y - _radius,
point.x + _radius, point.y + _radius);
canvas.drawOval(oval, ovalPaint);
}
public boolean draw(Canvas canvas, MapView mapView, boolean shadow,
long when) {
Projection projection = mapView.getProjection();
if (shadow == false && points != null) {
Point startPoint = null, endPoint = null;
Path path = new Path();
// We are creating the path
for (int i = 0; i < points.size(); i++) {
GeoPoint gPointA = points.get(i);
Point pointA = new Point();
projection.toPixels(gPointA, pointA);
if (i == 0) { // This is the start point
startPoint = pointA;
path.moveTo(pointA.x, pointA.y);
} else {
if (i == points.size() - 1)// This is the end point
endPoint = pointA;
path.lineTo(pointA.x, pointA.y);
}
}
Paint paint = new Paint();
paint.setAntiAlias(true);
paint.setColor(pathColor);
paint.setStyle(Paint.Style.STROKE);
paint.setStrokeWidth(5);
paint.setAlpha(90);
if (getDrawStartEnd()) {
if (startPoint != null) {
drawOval(canvas, paint, startPoint);
}
if (endPoint != null) {
drawOval(canvas, paint, endPoint);
}
}
if (!path.isEmpty())
canvas.drawPath(path, paint);
}
return super.draw(canvas, mapView, shadow, when);
}
public boolean getDrawStartEnd() {
return drawStartEnd;
}
public void setDrawStartEnd(boolean markStartEnd) {
drawStartEnd = markStartEnd;
}
}
}
I wish Google had not stopped supporting their documented KML output without notice.
I migrated my code to the Google Places API, now using its XML output.
https://developers.google.com/places/documentation/
Well, I just edited my answer.
Let's face it, Google has changed their system and we have to follow them, so let's use JSON or XML. :)
Edit, part two: I just found the best solution. It uses JSON and parses the result into a polyline, so we can do it! See:
Google Maps API Version difference
I found a way to get KML output as before, using standard Google Maps links for routes.
It seems that Google analyzes the referrer for such links, and if it is https://code.google.com it will generate a KML attachment instead of showing the map.
So, first, you need to make a project at https://code.google.com.
Then create an issue with your route link in a comment.
Now you can tap the link and get the KML attachment.
As for Android, I am now using:
Intent myIntent =
new Intent( android.content.Intent.ACTION_VIEW,
Uri.parse( "geo:0,0?q="+ lat +","+ lon ) );
startActivity(myIntent);
I think there should be something like that in iOS.
KML creation lives again!
The dead link simply needs to be modified to point to earth instead of maps.
Dead Link:
http://maps.google.com/maps?q=restaurant&mrt=yp&num=10&sll=37.786945,-122.406013&radius=5&output=kml
Working Link:
https://earth.google.com/earth/rpc/search?q=restaurant&mrt=yp&num=10&sll=37.786945,-122.406013&radius=5&output=kml