AndEngine examples show errors and do not work - andengine

I am very new to AndEngine and am trying to use the AndEngine examples from this git URL: https://github.com/nicolasgramlich/AndEngineExamples
But it always shows me errors in two classes:
BoundCameraExample and HullAlgorithmExample
In BoundCameraExample the error is on line 220 and says:
Type mismatch: cannot convert from void to AnimatedSprite
final AnimatedSprite face = new AnimatedSprite(pX, pY, this.mBoxFaceTextureRegion, this.getVertexBufferObjectManager()).animate(100);
In HullAlgorithmExample the error is on the import statement for DrawMode.
The error shows on line 11: import org.andengine.entity.primitive.vbo.DrawMode;
and lines 168 and 175 say DrawMode cannot be resolved to a variable.
I am using Java compiler 1.6 for all extensions.
I downloaded AndEngine and the extensions from the same git repo.
What is going wrong here? Please help.
Thanks to all.

In the BoundCameraExample, try
final AnimatedSprite face = new AnimatedSprite(pX, pY, this.mBoxFaceTextureRegion, this.getVertexBufferObjectManager());
face.animate(100);
instead of
final AnimatedSprite face = new AnimatedSprite(pX, pY, this.mBoxFaceTextureRegion, this.getVertexBufferObjectManager()).animate(100);
In the HullAlgorithmExample, import
import org.andengine.entity.primitive.DrawMode;
instead of
import org.andengine.entity.primitive.vbo.DrawMode;
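The root cause of the BoundCameraExample error is that in this version of AndEngine, animate(...) returns void instead of the sprite itself, so the chained call cannot be assigned to a variable. A minimal standalone sketch of the pattern (hypothetical Sprite class for illustration, not the real AndEngine API):

```java
// A method that returns void cannot appear on the right-hand side of an
// assignment; that is exactly the "cannot convert from void" error.
class Sprite {
    private long frameDurationMs;

    // In older APIs this returned the sprite itself (enabling chaining);
    // here it returns void, mirroring the newer AndEngine signature.
    void animate(long frameDurationMs) {
        this.frameDurationMs = frameDurationMs;
    }

    long getFrameDuration() {
        return frameDurationMs;
    }
}

public class ChainingDemo {
    public static void main(String[] args) {
        // Sprite face = new Sprite().animate(100); // does NOT compile: animate() is void
        Sprite face = new Sprite();  // create first...
        face.animate(100);           // ...then configure on a separate line
        System.out.println(face.getFrameDuration()); // prints 100
    }
}
```

The same create-then-configure split applies wherever the examples chain animate(...) onto a constructor call.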

In 2014, the answer by #swati-rawat is still helpful. I found two further issues; I'll report the solutions here since this question is well ranked:
in SplitScreenExample, as in BoundCameraExample, replace with:
final AnimatedSprite face = new AnimatedSprite(pX, pY, this.mBoxFaceTextureRegion, this.getVertexBufferObjectManager());
face.animate(100);
in TextBreakExample replace with:
this.mText = new Text(50, 40, this.mFont, "", 1000, new TextOptions(AutoWrap.LETTERS, AUTOWRAP_WIDTH, HorizontalAlign.CENTER, Text.LEADING_DEFAULT), vertexBufferObjectManager);

Related

Expected an assignment or function call and instead saw an expression

void draw();{
//background(0);
PVector mouse = new PVector(mouseX, mouseY);
PVector center = new PVector(width/2, height/2);
mouse.sub(center);
translate(width/2, height/2);
On the PVector lines I am getting the same "expected an assignment" error. I searched for a solution via Google and here, but those snippets seem formatted differently from what I have. Any ideas as to what might be going on here? I am totally new to coding.

How to fix Avrdude error when uploading to Nano

I get this error:
avrdude: stk500_recv(): programmer is not responding
avrdude: stk500_getsync() attempt 1 of 10: not in sync: resp=0x00
avrdude: stk500_recv(): programmer is not responding
avrdude: stk500_getsync() attempt 2 of 10: not in sync: resp=0x00
When I upload this code to my Nano:
#include <Servo.h>
Servo myservo;
int potpin = A0;
int val;
void setup() {
myservo.attach(9);
}
void loop() {
val = analogRead(potpin);
val = map(val, 0, 1023, 0, 180);
myservo.write(val);
delay(15);
}
The Arduino help center says I might have selected the wrong board and port. So I reselected the Nano board and put in COM3. I also reinstalled the CH340 driver as my Arduino is a clone. I’ve tried reuploading and restarting but nothing seems to work and I get the same error. Anyone know how to fix this?
Any help is much appreciated.

Using nDisplay with Vive Trackers in UE4

I have a question for anyone that's been using nDisplay with vive trackers.
I set up using Ben Kidd's videos on youtube with the nDisplay new project template and created this blueprint: https://blueprintue.com/blueprint/fgg1zcub/ by creating a blueprint subclass of DisplayClusterRootActor
I put GetVRPNTrackerLocation not being available as a function down to being on a different version of UE (4.26).
In VRPN, I'm getting the following data from the controller (not using a tracker atm):
Tracker openvr/controller/LHR-F7FD3B46#192.168.0.41:3884, sensor 0:
pos (-0.08, 0.78, -0.36); quat ( 0.20, 0.07, -0.15, 0.96)
Tracker openvr/controller/LHR-F7FD3B46#192.168.0.41:3884, sensor 0:
pos (-0.08, 0.78, -0.36); quat ( 0.20, 0.07, -0.16, 0.96)
...
and that's coming through in my Print String
so I know the data is passing through from the controller -> VRPN -> UE4 / nDisplay, and it looks similar to Ben's (numbers roughly from -2 to 2)
Lastly, in my nDisplay .cfg I have (alongside my monitor setup):
...
[camera] id="camera_static" loc="X=0,Y=0,Z=0" parent="eye_level" eye_swap="false" eye_dist="0.064" force_offset="0" tracker_id="ViveVRPN" tracker_ch="0"
[input] id="ViveVRPN" type="tracker" addr="openvr/controller/LHR-F7FD3B46#192.168.0.41:3884" loc="X=0,Y=0,Z=0" rot="P=0,Y=0,R=0" front="-Z" right="X" up="Y"
...
However, the movement of the camera is really tiny and not representative of the actual camera movements.
It's also not only me with this issue.
The issue is that GetVRPNTrackerLocation seemingly only takes the first axis into consideration, assigns it to front, and sets the rest to positive X.
As for the underlying problem, I have no idea where this happens, but since I needed a quick fix I just hardcoded these values into engine code, and didn't look for a more permanent fix.
So in case you want to do this workaround here's what I did:
Follow these steps to obtain a source build of Unreal 4.26 (I would recommend the 4.26 branch):
https://docs.unrealengine.com/en-US/ProductionPipelines/DevelopmentSetup/BuildingUnrealEngine/index.html
Find {UnrealSourceFolder}\Engine\Plugins\Runtime\nDisplay\Source\DisplayCluster\Private\Input\Devices\VRPN\Tracker\DisplayClusterVrpnTrackerInputDevice.cpp
Modify these two methods
FVector FDisplayClusterVrpnTrackerInputDevice::GetMappedLocation(const FVector& Loc, const AxisMapType Front, const AxisMapType Right, const AxisMapType Up) const
{
static TLocGetter funcs[] = { &LocGetX, &LocGetNX, &LocGetY, &LocGetNY, &LocGetZ, &LocGetNZ };
//return FVector(funcs[Front](Loc), funcs[Right](Loc), funcs[Up](Loc));
return FVector(funcs[AxisMapType::NZ](Loc), funcs[AxisMapType::X](Loc), funcs[AxisMapType::Y](Loc));
}
FQuat FDisplayClusterVrpnTrackerInputDevice::GetMappedQuat(const FQuat& Quat, const AxisMapType Front, const AxisMapType Right, const AxisMapType Up, const AxisMapType InAxisW) const
{
static TRotGetter funcs[] = { &RotGetX, &RotGetNX, &RotGetY, &RotGetNY, &RotGetZ, &RotGetNZ, &RotGetW, &RotGetNW };
//return FQuat(funcs[Front](Quat), funcs[Right](Quat), funcs[Up](Quat), -Quat.W);// funcs[axisW](quat));
return FQuat(funcs[AxisMapType::NZ](Quat), funcs[AxisMapType::X](Quat), funcs[AxisMapType::Y](Quat), -Quat.W);// funcs[axisW](quat));
}
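Stripped of the engine types, the hardcoded patch above simply reorders components: front = -Z, right = X, up = Y, with the quaternion's W negated. A standalone sketch of that remapping (plain Java for illustration only; the engine code itself is C++):

```java
// Illustrates the axis remapping hardcoded in the patch above: VRPN reports
// (x, y, z) in its own frame; the workaround maps it into Unreal's frame
// as (-z, x, y), and negates W for the quaternion.
public class AxisRemapDemo {
    static double[] remapLocation(double x, double y, double z) {
        // equivalent to funcs[AxisMapType::NZ], funcs[AxisMapType::X], funcs[AxisMapType::Y]
        return new double[] { -z, x, y };
    }

    static double[] remapQuat(double x, double y, double z, double w) {
        // same component shuffle, with W negated as in the patch
        return new double[] { -z, x, y, -w };
    }

    public static void main(String[] args) {
        // sample VRPN data from the question above
        double[] loc = remapLocation(-0.08, 0.78, -0.36);
        System.out.printf("%.2f %.2f %.2f%n", loc[0], loc[1], loc[2]); // prints 0.36 -0.08 0.78
    }
}
```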
So I just upgraded from a working 4.25 version to 4.26 and then came across the same issue. I then built the engine and debugged my way through to find the cause and a possible solution for this problem.
It seems to be a problem with the .cfg text files and the axes getting parsed incorrectly.
An easy solution is to import the .cfg file into the editor so that it gets converted to the new .json file format. There you can see the wrongly assigned axes on the tracker and correct them in the new file. Afterwards, just use the new .json file for your nDisplay configuration and it should work correctly.

VectorGraphics Unity3d - get reference to imported SVG in code

I am using Unity's Vector Graphics package to import a simple svg with 3 shapes.
Importing works successfully:
However, once imported - how do I get a reference to the svg - so that I can modify shapes and fills?
The documentation shows how to render (after making changes) but I don't understand how to first get the reference to the imported svg?
Even seeing some sample code that has been implemented would be enough but I can't find any online.
The way to do this would be to parse the SVG file to get a vector representation of the file (vector Scene), then you can change any properties before tessellation. For example:
string svg =
@"<svg xmlns=""http://www.w3.org/2000/svg"" xmlns:xlink=""http://www.w3.org/1999/xlink"" viewBox=""0 0 216 216"">
<g>
<polygon id=""Poly1"" points=""...""/>
</g>
</svg>";
// Import the SVG at runtime
var sceneInfo = SVGParser.ImportSVG(new StringReader(svg));
var shape = sceneInfo.NodeIDs["Poly1"].Shapes[0];
shape.Fill = new SolidFill() { Color = Color.red };
// Tessellate (the options control curve quality; these values follow
// the Vector Graphics package documentation's example)
var tessOptions = new VectorUtils.TessellationOptions() {
    StepDistance = 100.0f,
    MaxCordDeviation = 0.5f,
    MaxTanAngleDeviation = 0.1f,
    SamplingStepSize = 0.01f
};
var geoms = VectorUtils.TessellateScene(sceneInfo.Scene, tessOptions);
// Build a sprite
var sprite = VectorUtils.BuildSprite(geoms, 10.0f, VectorUtils.Alignment.Center, Vector2.zero, 128, true);
GetComponent<SpriteRenderer>().sprite = sprite;
See source answer

Display points when there's great distance between them (GWT-Openlayers)

The case is the following: I have a layer with two points on it. The first is in Australia, the second in the USA. The continent or the exact position of the points doesn't matter; the essential part is the great distance between them. When the application starts, the first point appears (the zoom level is 18). The second point isn't displayed because it is far away and the zoom level is high. Then I call the panTo function with the location of the second point. The map jumps to the right location, but the second point doesn't appear. The point only appears if I zoom in/out or resize the browser window. The GWT code:
LonLat center = new LonLat(151.304485, -33.807831);
final LonLat usaPoint = new LonLat(-106.356183, 35.842721);
MapOptions defaultMapOptions = new MapOptions();
defaultMapOptions.setNumZoomLevels(20);
// mapWidget
final MapWidget mapWidget = new MapWidget("100%", "100%", defaultMapOptions);
// google maps layer
GoogleV3Options gSatelliteOptions = new GoogleV3Options();
gSatelliteOptions.setIsBaseLayer(true);
gSatelliteOptions.setDisplayOutsideMaxExtent(true);
gSatelliteOptions.setSmoothDragPan(true);
gSatelliteOptions.setType(GoogleV3MapType.G_SATELLITE_MAP);
GoogleV3 gSatellite = new GoogleV3("Google Satellite", gSatelliteOptions);
mapWidget.getMap().addLayer(gSatellite);
// pointLayer
VectorOptions options = new VectorOptions();
options.setDisplayOutsideMaxExtent(true);
Vector vector = new Vector("layer1", options);
mapWidget.getMap().addLayer(vector);
mapWidget.getMap().addControl(new LayerSwitcher());
mapWidget.getMap().addControl(new MousePosition());
mapWidget.getMap().addControl(new ScaleLine());
mapWidget.getMap().addControl(new Scale());
// two points are added to the layer
center.transform(new Projection("EPSG:4326").getProjectionCode(), mapWidget.getMap().getProjection());
vector.addFeature(new VectorFeature(new Point(center.lon(), center.lat())));
usaPoint.transform(new Projection("EPSG:4326").getProjectionCode(), mapWidget.getMap().getProjection());
vector.addFeature(new VectorFeature(new Point(usaPoint.lon(), usaPoint.lat())));
// the center of the map is the first point
mapWidget.getMap().setCenter(center, 18);
// 3 sec later panTo second point
Timer t = new Timer() {
@Override
public void run() {
mapWidget.getMap().panTo(usaPoint);
}
};
t.schedule(3000);
I tried to reproduce this situation with pure Openlayers, but it worked fine. Here is the link
So I think the problem is with GWT-OpenLayers. Has anybody experienced such behaviour? Or does anybody have a solution to this problem?
What a strange problem.
For now I have only found a way around it, not a real fix. It seems to be a bug in GWT-OL as you say, but I can't imagine where.
What you can do is add the following 3 lines to your code :
mapWidget.getMap().panTo(usaPoint);
int zoom = mapWidget.getMap().getZoom();
mapWidget.getMap().setCenter(usaPoint, 0);
mapWidget.getMap().setCenter(usaPoint, zoom);
(note : I am a contributor to the GWT-OL project, I also informed other contributors of this problem, maybe they can find a better solution)
Edit: Another GWT-OL contributor looked into this but also couldn't find a real solution.
However, another workaround is to use zoomToExtent for the requested point:
Bounds b = new Bounds();
b.extend(new LonLat(usaPoint.lon(), usaPoint.lat()));
mapWidget.getMap().zoomToExtent(b);