Is it possible to get the roughness value of the material at the impact point (if there is one) of a ray trace?
// shooting laser
GetWorld()->LineTraceSingleByChannel(
    HitInfo,
    LidarBodyLoc,
    EndTrace,
    ECC_MAX,
    TraceParams,
    FCollisionResponseParams::DefaultResponseParam
);
auto row = *(HitInfo).GetComponent()->GetMaterial(0)->GetBaseMaterial()->Roughness;
The above code generates errors. Is there another way to get the roughness, or is there a flaw in the code I'm using?
You can try it like this:
FScalarMaterialInput& Input = HitInfo.GetComponent()->GetMaterial(0)->GetBaseMaterial()->Roughness;
or
auto Input = &(HitInfo.GetComponent()->GetMaterial(0)->GetBaseMaterial()->Roughness);
However, UMaterial::Roughness is a WITH_EDITORONLY_DATA property, so it can cause a compile error in packaged builds, where editor-only data is stripped.
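If you just need it to compile, you can guard the editor-only access; for packaged builds, a workable route is to read a scalar material parameter instead. A minimal sketch, assuming the hit material exposes its roughness as a scalar parameter named "Roughness" (that parameter name is my assumption, not a UMaterial built-in):
#include "Materials/Material.h"
#include "Materials/MaterialInterface.h"

float GetHitRoughness(const FHitResult& HitInfo)
{
    float Roughness = 0.5f; // arbitrary fallback
    UPrimitiveComponent* Comp = HitInfo.GetComponent();
    UMaterialInterface* Mat = Comp ? Comp->GetMaterial(0) : nullptr;
    if (Mat)
    {
        // Runtime-safe, but only works if the material (or instance)
        // exposes a scalar parameter with this name -- an assumption here.
        Mat->GetScalarParameterValue(FName(TEXT("Roughness")), Roughness);
#if WITH_EDITORONLY_DATA
        // Editor builds only: the raw material input is still available.
        UMaterial* Base = Mat->GetBaseMaterial();
        if (Base && Base->Roughness.Expression == nullptr)
        {
            Roughness = Base->Roughness.Constant;
        }
#endif
    }
    return Roughness;
}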
I'm working on reading a STEP file into my C++ application, translating it into OCCT shapes, and displaying them using VTK. Everything seems to be working fine EXCEPT the scale. The shape I'm importing is just under 20 mm in diameter according to a 3D viewer (which matches what it was when I exported it from my app), but when I import it, it displays as the expected shape but at just under 20 meters in diameter. The program uses meters internally; OCCT appears to be ignoring units during import.
My program runs the following on initialization:
STEPControl_Controller::Init();
Interface_Static::SetCVal("xstep.cascade.unit","M");
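For reference, the static can be read back to confirm the setting actually took effect. A quick sketch, reusing the debugPrint helper from the import code below (SetCVal returns false if the parameter is unknown, and Interface_Static::CVal is the standard getter):
#include <STEPControl_Controller.hxx>
#include <Interface_Static.hxx>

// Sketch: verify the unit flag sticks after the controller is initialised.
STEPControl_Controller::Init();
if (!Interface_Static::SetCVal("xstep.cascade.unit", "M"))
    debugPrint("SetCVal for ", "xstep.cascade.unit", " failed");
debugPrint("Unit is ", Interface_Static::CVal("xstep.cascade.unit"), " after init");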
The import code is:
STEPControl_Reader rdr;
IFSelect_ReturnStatus ret = rdr.ReadFile(filname);
if (ret == IFSelect_RetDone)
{
    int navail = rdr.NbRootsForTransfer();
    debugPrint("Found ", std::to_string(navail), " roots available");
    int nroots = rdr.TransferRoots();
    debugPrint("Transferred ", std::to_string(nroots), " roots");
    TopoDS_Shape s = rdr.OneShape();
    // Process the returned shape for display after this...
}
I've tried changing the value of "xstep.cascade.unit", and it has no effect on the size of the imported shape. It works as expected when exporting OCCT shapes to a STEP file, but not when importing. Is there some parameter or initialization step I'm missing? We're using OCCT version 7.6.0.
ADDENDUM: I did check inside the STEP file, and the units are recorded with lines like:
#68 = ( LENGTH_UNIT() NAMED_UNIT(*) SI_UNIT(.MILLI.,.METRE.) );
So it shouldn't be a matter of not knowing what it's translating from.
I made PNGs for custom markers on my GoogleMap view. For example, by using:
BitmapDescriptor bikeBlack = await BitmapDescriptor.fromAsset(const ImageConfiguration(), "assets/images/bike_black.png");
I obtain an object that I can use as a marker directly. However, I also need to be able to change the brightness at runtime for about 50 markers of 4 different types. The only solution I have come up with so far is creating 1,024 different PNGs (4 types × 256 brightness levels). That would increase the app size by about 2 MB, and it would be a lot of work.
I cannot really afford to use await statements, since they slow the app down considerably, but if I have to, I can live with that.
As far as I can tell, a marker icon has to be a BitmapDescriptor, but I cannot find a way to change the brightness of one.
I'm close to giving up and writing a Python script that will generate the 1,024 PNGs for me. But there must be a nicer, more efficient solution. If you have one, please let me know.
[EDIT]:
I went with creating the 1,024 images. For anybody in the same situation, this is the script I used:
from PIL import Image, ImageEnhance

img = Image.open("../img.png")
enhancer = ImageEnhance.Brightness(img)
for i in range(256):
    # enhance(1.0) returns the original image; smaller factors darken it
    img_output = enhancer.enhance(i / 255)
    img_output.save("img_{}.png".format(i), format="png")
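A variant covering all four marker types in one run might look like this (a sketch; only bike_black.png is named in the question, the other file names are made up):
from PIL import Image, ImageEnhance

# Only bike_black is known from the question; the other names are placeholders.
for name in ["bike_black", "bike_white", "scooter_black", "scooter_white"]:
    img = Image.open("../{}.png".format(name))
    enhancer = ImageEnhance.Brightness(img)
    for i in range(256):
        # 4 types x 256 brightness levels = the 1,024 PNGs mentioned above
        enhancer.enhance(i / 255).save("{}_{}.png".format(name, i), format="png")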
I want to scroll the ball's texture to show that it's moving, so I have written this code:
// ball texture rolling
textureOffset.x -= myRigidBody.velocity.normalized.z * (speed / 500f);
textureOffset.y = myRigidBody.velocity.normalized.x * (speed / 40);
myMaterial.mainTextureOffset = Vector2.Lerp (myMaterial.GetTextureOffset (1), textureOffset, speed * Time.fixedDeltaTime);
I was getting this error during game play in the Unity Editor.
This is the material assigned to the ball object:
I have just upgraded to Unity 2017.3.1p4 and the error started appearing. I don't know what to do now; please give me suggestions to solve this.
There are a couple of possible reasons for this bug, so here are a few suggestions:
1. Make sure the shader you are using has the "_RendererColor" texture property; look for something like this:
Properties {
    _RendererColor ("Base (RGB)", 2D) = "white" {}
}
They might have changed its name in the update.
2. Make sure your shader has the appropriate include file, for example:
#include "UnityCG.cginc"
Some functionality may require certain includes after the update (see the combined sketch after this list).
3. This could just be a random bug after a Unity crash or an update. Some friends of mine have had Unity crash and then hit bugs like this, and updating might do something similar. Try remaking the shader or creating a new one to replace this one.
This is all I can think of.
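For reference, a minimal shader skeleton that combines suggestions 1 and 2 might look like this (a sketch only; the property declaration follows the snippet above, and the shader name is made up):
Shader "Custom/BallScrollSketch" {
    Properties {
        _RendererColor ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _RendererColor;
            float4 _RendererColor_ST; // holds the tiling/offset set on the material

            struct v2f {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                // TRANSFORM_TEX applies the material's tiling/offset for this texture
                o.uv = TRANSFORM_TEX(v.texcoord, _RendererColor);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                return tex2D(_RendererColor, i.uv);
            }
            ENDCG
        }
    }
}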
I am writing some editor extensions which set up big, complex Mecanim blend trees and state machines. I have no problem creating layers and states, attaching motions to blend trees via editor scripts, and so on.
However, I cannot find any call that lets me set a Motion's speed setting. Is this possible?
While there's no direct API call to set the speed of a motion in a blend tree, it is possible to do so via Unity's SerializedObject/SerializedProperty API. Something like this ought to do it:
var serialised = new SerializedObject(blendTree);
// "m_Childs" is the serialized array backing the blend tree's child motions
var children = serialised.FindProperty("m_Childs");
var child = children.GetArrayElementAtIndex(0);
// "m_TimeScale" is the per-child speed value shown in the inspector
var timeScale = child.FindPropertyRelative("m_TimeScale");
timeScale.floatValue = speed;
serialised.ApplyModifiedProperties();
I think what you're looking for is AnimatorState.speedParameter:
https://docs.unity3d.com/ScriptReference/Animations.AnimatorState-speedParameter.html
This thread shows how to use it, and it also has interesting workarounds in case that's not enough: https://forum.unity3d.com/threads/mecanim-change-animation-speed-of-specific-animation-or-layers.160395/page-2
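For completeness, a minimal editor-script sketch of that approach (the state machine argument, state name, and parameter name are all made up for illustration):
using UnityEditor.Animations;

// Sketch: make a state's playback speed follow a controller parameter.
static void AddSpeedControlledState(AnimatorStateMachine stateMachine)
{
    AnimatorState state = stateMachine.AddState("Run");
    state.speed = 1.0f;                          // base speed multiplier
    state.speedParameter = "RunSpeedMultiplier"; // float parameter the controller must define
    state.speedParameterActive = true;           // actually enable the parameter override
}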
I am trying to persist markers in an augmented reality game. Here is the gist of what I am doing:
I have my users recording and saving an area to an ADF. Then they drop markers into the scene and save out their position data in Unity world coordinates to a text file. I then restart the app, load and localize to the ADF, and load the markers.
In order to get this working, I've modified the ARPoseController.cs file in the Unity demo package to use the Area Description as its base frame. In the _UpdateTransformation method I've swapped out the frame pair
pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_START_OF_SERVICE;
pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
for
pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_AREA_DESCRIPTION;
pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
I've also added some code confirming that I'm successfully localizing to the ADF, but I'm noticing that my markers' positions in Unity world space are not correct relative to the real environment.
I can confirm that my markers save and load properly relative to the START_OF_SERVICE origin, so I assume they are serializing and deserializing correctly. What could be causing this? Am I wrong in assuming this should just work by switching the base frame pair to AREA_DESCRIPTION instead of START_OF_SERVICE?
I had a similar problem getting the AR and ADF integrated. I had to modify TangoPointCloud to check whether an Area Description is in use in OnTangoDepthAvailable() and adjust the baseFrame as required.
i.e.:
// Use the ADF origin as the base frame when the pose controller is running
// off an Area Description; otherwise fall back to start of service.
if (m_tangoDeltaPoseController.m_useAreaDescriptionPose)
{
    pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_AREA_DESCRIPTION;
    pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
}
else
{
    pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_START_OF_SERVICE;
    pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
}
That way, the geometry of the point cloud adjusts itself based on the ADF offset instead of from device start.
After that change, when I use the AR sample code to drop markers, it registers the surface properly, so I'm placing the markers in the correct spots and orientations. I'm still encountering some flakiness with the markers not adjusting when relocalizing, though; I still have to look into AreaLearningInGameController for loop-closure events.
Hope that helps!