Live wallpaper background image is bigger than screen size (AndEngine)

I want to use one image for all screens, but if I load an image bigger than the screen size it is not visible. The image size is 1280x720 and the screen size is 960x540. What can I do?
Code:
public EngineOptions onCreateEngineOptions() {
final DisplayMetrics displayMetrics = new DisplayMetrics();
WindowManager wm = (WindowManager)getSystemService(WINDOW_SERVICE);
wm.getDefaultDisplay().getMetrics(displayMetrics);
wm.getDefaultDisplay().getRotation();
CAMERA_WIDTH = displayMetrics.widthPixels;
CAMERA_HEIGHT = displayMetrics.heightPixels;
this.mCamera = new Camera(0, 0, CAMERA_WIDTH, CAMERA_HEIGHT);
return new EngineOptions(true, ScreenOrientation.PORTRAIT_FIXED, new FillResolutionPolicy(), this.mCamera);}
Next:
public void onCreateResources(OnCreateResourcesCallback createResourcesCallback) throws Exception {
final BuildableBitmapTextureAtlas bitmapTextureAtlas = new BuildableBitmapTextureAtlas(
mEngine.getTextureManager(), CAMERA_WIDTH, CAMERA_HEIGHT,
TextureOptions.BILINEAR);
mTextureRegion = BitmapTextureAtlasTextureRegionFactory
.createFromAsset(bitmapTextureAtlas, this, "gfx/h720x1280a.png");
}

It's better to scale it so that it fits all screens:
int spriteWidth = 1280;
int spriteHeight = 720;
float spriteScaleX = (float) screenWidth / spriteWidth;
float spriteScaleY = (float) screenHeight / spriteHeight;
Apply these factors as the scale of your background sprite. It will then fit all devices.
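As a concrete sketch (AndEngine GLES2 anchor-center branch, reusing the question's mTextureRegion and CAMERA_WIDTH/CAMERA_HEIGHT; the sprite and scene names are placeholders, so adjust them to your project):
final Sprite backgroundSprite = new Sprite(
        CAMERA_WIDTH * 0.5f, CAMERA_HEIGHT * 0.5f, // anchor-center: place it at the camera centre
        mTextureRegion,
        getVertexBufferObjectManager());
// Stretch the 1280x720 texture so it exactly covers the camera.
backgroundSprite.setScaleX((float) CAMERA_WIDTH / mTextureRegion.getWidth());
backgroundSprite.setScaleY((float) CAMERA_HEIGHT / mTextureRegion.getHeight());
scene.attachChild(backgroundSprite);
Because the camera is created at the device resolution, the same two factors give a full-screen background on any device.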

You need an image that maintains the aspect ratio of your camera width and height. It is not necessary to use a large-resolution image; OpenGL can maintain the quality of the image when it is scaled.

Related

Unity: Reduce size of Render Texture before executing Texture2D.ReadPixels

I'm working on code where I basically have to take a low-quality screenshot about every 30 milliseconds. The script is attached to a camera.
What I want to do is reduce the render texture size. The way the code is right now, changing either W or H basically gets me a SECTION of what the camera sees instead of a reduced-size version. So my question is: how can I resize or downsample what is read into the screenshot (Texture2D) so that it is still a representation of the entire screen?
public class CameraRenderToImage : MonoBehaviour
{
private RemoteRenderServer rrs;
void Start(){
TimeStamp.SetStart();
Camera.onPostRender += OnPostRenderCallback;
}
void OnPostRenderCallback(Camera cam){
if (TimeStamp.HasMoreThanThisEllapsed(30)){
TimeStamp.SetStart();
int W = Screen.width;
int H = Screen.height;
Texture2D screenshot = new Texture2D(W,H, TextureFormat.RGB24, false);
screenshot.ReadPixels( new Rect(0, 0, W,H), 0, 0);
byte[] bytes = screenshot.EncodeToPNG();
System.IO.File.WriteAllBytes("check_me_out.png", bytes);
TimeStamp.Tok("Encode to PNG and Save");
}
}
// Remove the onPostRender callback
void OnDestroy()
{
Camera.onPostRender -= OnPostRenderCallback;
}
}
If you need to resize your render texture from a script, you can refer to the following code snippet:
void Resize(RenderTexture renderTexture, int width, int height) {
if (renderTexture) {
renderTexture.Release();
renderTexture.width = width;
renderTexture.height = height;
}
}
To make it possible to resize the render texture you first need to make sure it is released.
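For example (myRenderTexture here stands for whatever render texture your camera actually writes into):
Resize(myRenderTexture, Screen.width / 4, Screen.height / 4);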
To get a Texture2D:
private Texture2D ToCompressedTexture(ref RenderTexture renderTexture)
{
var texture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.ARGB32, false);
var previousTarget = RenderTexture.active;
RenderTexture.active = renderTexture;
texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
RenderTexture.active = previousTarget;
texture.Compress(false);
texture.Apply(false, true);
renderTexture.Release();
renderTexture = null;
return texture;
}
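If what you actually need is a smaller image of the entire view (rather than a crop), another option, not shown in the snippets above, is to render the camera into a smaller temporary RenderTexture and read that back. A minimal sketch; the CaptureDownscaled helper and its parameter choices are illustrative:
// Renders the full camera view into a width x height texture and reads it back,
// so the result is a downscaled picture of everything the camera sees.
Texture2D CaptureDownscaled(Camera cam, int width, int height)
{
    RenderTexture rt = RenderTexture.GetTemporary(width, height, 24);
    RenderTexture previousTarget = cam.targetTexture;
    RenderTexture previousActive = RenderTexture.active;
    cam.targetTexture = rt;
    cam.Render();
    RenderTexture.active = rt;
    Texture2D screenshot = new Texture2D(width, height, TextureFormat.RGB24, false);
    screenshot.ReadPixels(new Rect(0, 0, width, height), 0, 0);
    screenshot.Apply();
    cam.targetTexture = previousTarget;
    RenderTexture.active = previousActive;
    RenderTexture.ReleaseTemporary(rt);
    return screenshot;
}
Calling it with, say, Screen.width / 4 and Screen.height / 4 keeps the aspect ratio while cutting the pixel count, which also makes the EncodeToPNG step every 30 milliseconds considerably cheaper.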

Take a photo in Unity (C#)

I'm trying to build a program that takes your photo and places it on a different background, such as a monument. So far, I was able to turn the camera on when I start the project with this code:
webcamTexture = new WebCamTexture();
rawImage.texture = webcamTexture;
rawImage.material.mainTexture = webcamTexture;
webcamTexture.Play();
Texture2D PhotoTaken = new Texture2D (webcamTexture.width, webcamTexture.height);
PhotoTaken.SetPixels (webcamTexture.GetPixels ());
PhotoTaken.Apply ();
However, I can't take a screenshot or photo because it always ends up all black. I've tried different code but nothing works. Can someone please help? Thanks.
EDIT
After some tries, this is the code I have:
WebCamTexture webcamTexture;
public RawImage rawImage;
void Start () {
webcamTexture = new WebCamTexture();
rawImage.texture = webcamTexture;
rawImage.material.mainTexture = webcamTexture;
webcamTexture.Play();
RenderTexture texture= new RenderTexture(webcamTexture.width, webcamTexture.height,0);
Graphics.Blit(webcamTexture, texture);
Button btn = yourButton.GetComponent<Button>();
btn.onClick.AddListener(OnClick);
}
public IEnumerator Coroutine(){
yield return new WaitForEndOfFrame ();
}
public void OnClick() {
var width = 767;
var height = 575;
Texture2D texture = new Texture2D(width, height);
texture.ReadPixels(new Rect(0, 0, width, height), 0, 0);
texture.Apply();
// Encode texture into PNG
var bytes = texture.EncodeToPNG();
//Destroy(texture);
File.WriteAllBytes (Application.dataPath + "/../SavedScreen.png", bytes);
}
and with this next code the screenshot is taken, but it captures the whole screen rather than just a part of it.
void Start()
{
// Set the playback framerate!
// (real time doesn't influence time anymore)
Time.captureFramerate = frameRate;
// Find a folder that doesn't exist yet by appending numbers!
realFolder = folder;
int count = 1;
while (System.IO.Directory.Exists(realFolder))
{
realFolder = folder + count;
count++;
}
// Create the folder
System.IO.Directory.CreateDirectory(realFolder);
}
void Update()
{
// name is "realFolder/shot 0005.png"
var name = string.Format("{0}/shot {1:D04}.png", realFolder, Time.frameCount);
// Capture the screenshot
Application.CaptureScreenshot(name, sizeMultiplier);
}
}
You can take a screenshot like this in Unity
Application.CaptureScreenshot("Screenshot.png");
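Note that in newer Unity versions Application.CaptureScreenshot has been replaced by ScreenCapture.CaptureScreenshot, so the equivalent call would be ScreenCapture.CaptureScreenshot("Screenshot.png").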
EDIT 1
To take a screenshot of a specific part of the screen, use the following script:
var width = 400;
var height = 300;
var startX = 200;
var startY = 100;
var tex = new Texture2D(width, height, TextureFormat.RGB24, false);
tex.ReadPixels(new Rect(startX, startY, width, height), 0, 0);
tex.Apply();
// Encode texture into PNG
var bytes = tex.EncodeToPNG();
Destroy(tex);
File.WriteAllBytes(Application.dataPath + "/../SavedScreen.png", bytes);
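One more caveat: ReadPixels copies from the screen, and Unity only guarantees valid contents when it runs at the end of a frame, which may be why screenshots come out black when the pixels are read directly from a button handler (the edited question even defines a WaitForEndOfFrame coroutine but never starts it). A hedged sketch of wiring the region capture through a coroutine; the class and method names are illustrative:
using System.Collections;
using System.IO;
using UnityEngine;

public class RegionScreenshot : MonoBehaviour
{
    // Hook this up to the button instead of reading pixels directly in OnClick.
    public void TakeScreenshot()
    {
        StartCoroutine(CaptureRegion(new Rect(200, 100, 400, 300)));
    }

    private IEnumerator CaptureRegion(Rect region)
    {
        // Wait until rendering for this frame has finished before reading the screen.
        yield return new WaitForEndOfFrame();
        var tex = new Texture2D((int)region.width, (int)region.height, TextureFormat.RGB24, false);
        tex.ReadPixels(region, 0, 0);
        tex.Apply();
        File.WriteAllBytes(Application.dataPath + "/../SavedScreen.png", tex.EncodeToPNG());
        Destroy(tex);
    }
}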

Taking snapshots of an image in Unity

I am trying to take snapshots of the materials I use in my application in Unity. I simply add a directional light and a camera in perspective mode, then render the result to a texture and save it as a .png file. The result is good, but there is a strange gizmo-like figure in the middle of the image. Here it is:
The camera and light are far enough from the object. I also disabled the light to see if the figure was caused by the directional light, but that didn't solve it. Does anyone know what causes this elliptic figure? Thanks in advance.
Edit: here is the code.
public static Texture2D CreateThumbnailFromMaterial(Material _material, string _name, string _path)
{
GameObject sphereObj = GameObject.CreatePrimitive(PrimitiveType.Sphere);
sphereObj.name = _name;
sphereObj.GetComponent<Renderer>().material = _material;
Texture2D thumbnailTexture = CreateThumbnailFromModel(sphereObj, _path);
sphereObj.GetComponent<Renderer>().material = null;
Object.DestroyImmediate(sphereObj.gameObject);
return thumbnailTexture;
}
public static Texture2D CreateThumbnailFromModel(GameObject _gameObject, string _path)
{
Texture2D thumbnailTexture = new Texture2D(textureSize, textureSize);
thumbnailTexture.name = _gameObject.name.Simplify();
GameObject cameraObject = Object.Instantiate(Resources.Load("SceneComponent/SnapshotCamera") as GameObject);
Camera snapshotCamera = cameraObject.GetComponent<Camera>();
if (snapshotCamera)
{
GameObject sceneObject = GameObject.Instantiate(_gameObject) as GameObject;
sceneObject.transform.Reset();
sceneObject.transform.position = new Vector3(1000, 0, -1000);
sceneObject.hideFlags = HideFlags.HideAndDontSave;
// Create render texture
snapshotCamera.targetTexture = RenderTexture.GetTemporary(textureSize, textureSize, 24);
RenderTexture.active = snapshotCamera.targetTexture;
// Set layer
foreach (Transform child in sceneObject.GetComponentsInChildren<Transform>(true))
{
child.gameObject.layer = LayerMask.NameToLayer("ObjectSnapshot");
}
// Calculate bounding box
Bounds bounds = sceneObject.GetWorldSpaceAABB();
float maxBoundValue = 0f;
if (bounds.IsValid())
{
maxBoundValue = Mathf.Max(bounds.size.x, bounds.size.y, bounds.size.z);
}
double fov = Mathf.Deg2Rad * snapshotCamera.GetComponent<Camera>().fieldOfView;
float distanceToCenter = (maxBoundValue) / (float)System.Math.Tan(fov);
cameraObject.transform.LookAt(bounds.center);
cameraObject.transform.position = bounds.center - (snapshotCamera.transform.forward * distanceToCenter);
cameraObject.transform.SetParent(sceneObject.transform);
snapshotCamera.Render();
thumbnailTexture.ReadPixels(new Rect(0, 0, textureSize, textureSize), 0, 0);
thumbnailTexture.Apply();
sceneObject.transform.Reset();
snapshotCamera.transform.SetParent(null);
RenderTexture.active = null;
GameObject.DestroyImmediate(sceneObject);
GameObject.DestroyImmediate(cameraObject);
// Save as .png
IO.IOManager.Instance.SaveAsPNG(_path + thumbnailTexture.name, thumbnailTexture);
}
return thumbnailTexture;
}
And here are my camera properties.

AndEngine live wallpaper: losing textures on a low-end device

I am developing a live wallpaper using the AndEngine GLES2 anchor-center branch, based on the Development Cookbook. The wallpaper works fine on mid-range to high-end devices but has problems on low-end devices. I have tested it on a Samsung Galaxy Ace, a Micromax Funbook tablet and the problem device, a Samsung Galaxy Y. The issue appears only on the Samsung Galaxy Y, the only low-end device I have.
Issue
All sprites lose their textures, sometimes when unlocking the screen and sometimes when returning to the home screen. The error is not generated in a predictable manner; sometimes it doesn't cause any issue at all. But when it does occur, even in preview mode, I have to force close the application and start it again to get the wallpaper working.
These are the details of my live wallpaper:
The wallpaper has a background sprite, a main image sprite and two BatchedSpriteParticleSystems with some initializers and modifiers.
I have a separate folder in the assets for lower-end devices (320x480) where I keep smaller images; in that case I load all images into a single texture atlas. Otherwise I use two texture atlases: one for the background image, and one for the main image and the two particle images. I use a resource manager class, as per the AndEngine cookbook, to load the textures.
Please help me sort out the issue; I don't know where I am going wrong.
Here is my code.
LiveWallpaperExtensionService:
@TargetApi(13)
public class LiveWallpaperExtensionService extends BaseLiveWallpaperService {
public Sprite bg_Sprite;
public Sprite main_image_sprite;
public SpriteBackground background;
public BatchedSpriteParticleSystem beamParticleSystem;
public BatchedSpriteParticleSystem starParticleSystem;
private Camera mCamera;
private Scene mScene;
@Override
public org.andengine.engine.Engine onCreateEngine(
final EngineOptions pEngineOptions) {
return new FixedStepEngine(pEngineOptions, 50);
}
public EngineOptions onCreateEngineOptions() {
Display display = ((WindowManager) getSystemService(WINDOW_SERVICE))
.getDefaultDisplay();
Utils.setGlobalWidthandHeight(Utils.getDisplaySize(display));
mCamera = new Camera(0, 0, Global.Width, Global.Height);
EngineOptions engineOptions = new EngineOptions(true,
ScreenOrientation.PORTRAIT_SENSOR, new FillResolutionPolicy(),
mCamera);
engineOptions.getRenderOptions().setDithering(true);
return engineOptions;
}
public void onCreateResources(
OnCreateResourcesCallback pOnCreateResourcesCallback) {
System.out.println("On create resourses");
ResourceManager.getInstance().loadBlueTextures(mEngine, this);
pOnCreateResourcesCallback.onCreateResourcesFinished();
}
public void onCreateScene(OnCreateSceneCallback pOnCreateSceneCallback) {
System.out.println("On create scene");
mScene = new Scene();
pOnCreateSceneCallback.onCreateSceneFinished(mScene);
}
public void onPopulateScene(Scene arg0,
OnPopulateSceneCallback pOnPopulateSceneCallback) {
System.out.println("on populate ");
final float positionX = Global.Width * 0.5f;
final float positionY = Global.Height * 0.5f;
bg_Sprite = new Sprite(positionX, positionY,
ResourceManager.getInstance().mBackgroundTextureRegion,
this.getVertexBufferObjectManager());
main_image_sprite = new Sprite(positionX, positionY,
ResourceManager.getInstance().mJesusTextureRegion,
this.getVertexBufferObjectManager());
/*
* Define the center point of the particle system spawn location
*/
final int bparticleSpawnCenterX = (int) (Global.Width * 0.5f);
final int bparticleSpawnCenterY = (int) ((Global.Height * 0.5f) + ((Global.Height * 0.5f)) * 0.5f) - 25;
/* Define the radius of the circle for the particle emitter */
final float particleEmitterRadius = 50;
/* Create the particle emitter */
CircleOutlineParticleEmitter bparticleEmitter = new CircleOutlineParticleEmitter(
bparticleSpawnCenterX, bparticleSpawnCenterY,
particleEmitterRadius);
beamParticleSystem = new BatchedSpriteParticleSystem(bparticleEmitter,
10, 15, 50, ResourceManager.getInstance().mBeamTextureRegion,
mEngine.getVertexBufferObjectManager());
beamParticleSystem
.addParticleInitializer(new ExpireParticleInitializer<UncoloredSprite>(
3));
beamParticleSystem
.addParticleInitializer(new AccelerationParticleInitializer<UncoloredSprite>(
-150, 150, -150, 150));
RectangleParticleEmitter particleEmitter = new RectangleParticleEmitter(
((int) (Global.Width * 0.5f)), ((int) (Global.Height * 0.5f)),
Global.Width, Global.Height);
// Create a batched particle system for efficiency
starParticleSystem = new BatchedSpriteParticleSystem(particleEmitter,
1, 2, 20, ResourceManager.getInstance().mStarTextureRegion,
mEngine.getVertexBufferObjectManager());
/* Add an acceleration initializer to the particle system */
starParticleSystem
.addParticleInitializer(new ExpireParticleInitializer<UncoloredSprite>(
10));
starParticleSystem
.addParticleInitializer(new RotationParticleInitializer<UncoloredSprite>(
0, 160));
/* Define min/max values for the particle's scale */
starParticleSystem
.addParticleInitializer(new ScaleParticleInitializer<UncoloredSprite>(
0.3f, 1.5f));
/* Define the alpha modifier's properties */
starParticleSystem
.addParticleModifier(new AlphaParticleModifier<UncoloredSprite>(
0, 2, 0, 1));
/* Define the rotation modifier's properties */
starParticleSystem
.addParticleModifier(new RotationParticleModifier<UncoloredSprite>(
1, 9, 0, 180));
// Add alpha ('fade out') modifier
starParticleSystem
.addParticleModifier(new AlphaParticleModifier<UncoloredSprite>(
8, 10, 1, 0));
/*
* Create the SpriteBackground object, specifying the color values &
* Sprite object to display
*/
final float red = 0.7f;
final float green = 0.78f;
final float blue = 0.85f;
final float alpha = 1;
background = new SpriteBackground(red, green, blue, bg_Sprite);
mScene.setBackground(background);
mScene.setBackgroundEnabled(true);
// Attach our particle system to the scene
mScene.attachChild(starParticleSystem);
mScene.attachChild(beamParticleSystem);
mScene.attachChild(main_image_sprite);
bg_Sprite.setIgnoreUpdate(true);
main_image_sprite.setIgnoreUpdate(true);
pOnPopulateSceneCallback.onPopulateSceneFinished();
}
@Override
protected synchronized void onPause() {
System.out.println("On paused");
super.onPause();
if (starParticleSystem != null) {
starParticleSystem.setParticlesSpawnEnabled(false);
}
if (beamParticleSystem != null) {
beamParticleSystem.setParticlesSpawnEnabled(false);
}
}
@Override
protected synchronized void onResume() {
System.out.println("On resume");
super.onResume();
if (starParticleSystem != null) {
starParticleSystem.setParticlesSpawnEnabled(true);
}
if (beamParticleSystem != null) {
beamParticleSystem.setParticlesSpawnEnabled(true);
}
}
}
Please help me sort out this issue. I welcome all ideas and suggestions, anything at all that might help solve it.
I've noticed that the Galaxy Y has lots of issues. I was getting lots of crash reports for a game of mine until I blocked that device from downloading it, and all the reports stopped (it was the only device with issues).
I suggest you do the same.
Edit:
If you want to choose which devices to support, you can use this example:
<supports-screens
android:largeScreens="true"
android:normalScreens="true"
android:smallScreens="false"
android:anyDensity="true" />
Modify that as you see fit.
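For reference, the supports-screens element filters by screen size when the app is listed on Google Play: with android:smallScreens="false", small-screen devices such as the Galaxy Y are simply not offered the app, which is what prevents them from downloading it.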

Drawing in Canvas with GWT

I have two images that I want to draw into a canvas object. I get those images from a server, and when the load handler fires I read the dimensions of the image (both have the same width and height) and calculate the dimensions of the canvas. Then I draw each image at the calculated x,y position in the canvas. The problem is that only one image appears in the canvas. Why?
Here is a part of the code:
final Image siImg = new Image();
siImg.setVisible(false);
siImg.setUrl(Constants.URL_PREFIX + siPath);
siImg.addLoadHandler(new LoadHandler() {
@Override
public void onLoad(LoadEvent event) {
int siWidth = siImg.getWidth();
int siHeight = siImg.getHeight();
siImg.removeFromParent();
if (!CategoryTableView.this.dimFromBg) {
CategoryTableView.this.width = siWidth;
CategoryTableView.this.height = siHeight * sSize;
//CategoryTableView.this.setPixelSize(CategoryTableView.this.width, CategoryTableView.this.height);
CategoryTableView.this.canvas.setPixelSize(CategoryTableView.this.width, CategoryTableView.this.height);
CategoryTableView.this.canvas.setCoordinateSpaceHeight(CategoryTableView.this.height);
CategoryTableView.this.canvas.setCoordinateSpaceWidth(CategoryTableView.this.width);
CategoryTableView.this.dimFromBg = true;
}
ImageElement imageElement = (ImageElement) siImg.getElement().cast();
int left = xOff;
int top = yOff + (siHeight * fi);
CategoryTableView.this.context.drawImage(imageElement, left, top);
}
});
RootPanel.get().add(siImg);
OK, I think I found it... I have to save the context's state each time. Is that right? (Because it works now!)
You add the image to your root panel in the last line
final Image siImg = new Image();
...
RootPanel.get().add(siImg);
instead of adding your canvas. So you will only see the image instead of the canvas. You have to add the canvas to your root panel and draw both images onto it. For performance reasons, it is better to draw to a backbuffer instead of drawing directly to the canvas. Here is a little example:
Canvas canvas = Canvas.createIfSupported();
Canvas backBuffer = Canvas.createIfSupported();
Context2d context = canvas.getContext2d();
Context2d backBufferContext = backBuffer.getContext2d();
Image image1 = new Image("http://your.url.to/image.jpg");
image1.addLoadHandler(new LoadHandler() {
public void onLoad(LoadEvent event) {
// do anything you want here
doDraw();
}
});
Image image2 = new Image("http://your.url.to/image2.jpg");
image2.addLoadHandler(new LoadHandler() {
public void onLoad(LoadEvent event) {
// do anything you want here
doDraw();
}
});
RootPanel.get().add(canvas);
And the draw-method would look like this:
public void doDraw() {
backBufferContext.setFillStyle(redrawColor);
backBufferContext.fillRect(0, 0, width, height);
ImageElement imageElement = ImageElement.as(image1.getElement());
backBufferContext.drawImage(imageElement, 0, 0, 1024, 768, 0, 0, 102, 76);
imageElement = ImageElement.as(image2.getElement());
backBufferContext.drawImage(imageElement, 0, 0, 1024, 768, 102, 76, 102, 76);
context.drawImage(backBufferContext.getCanvas(), 0, 0);
}
Please note: this example relies on variables with class-wide scope (the canvases, contexts and images). Adapt it to your needs by either passing them as arguments or defining them as fields. The drawing areas are also hardcoded in this example; you will need to change those as well.
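Two details the example above leaves out, which the question's own code already handles, are sizing the canvases and attaching the (hidden) images to the page so that their load handlers fire reliably. A sketch, assuming the same canvas, backBuffer, image1 and image2 variables (the width/height values are hypothetical):
// Give both canvases a drawing surface of the desired size.
int width = 1024, height = 768;
canvas.setPixelSize(width, height);
canvas.setCoordinateSpaceWidth(width);
canvas.setCoordinateSpaceHeight(height);
backBuffer.setCoordinateSpaceWidth(width);
backBuffer.setCoordinateSpaceHeight(height);
// Images must be attached to the DOM for LoadEvent to fire in all browsers; keep them invisible.
image1.setVisible(false);
image2.setVisible(false);
RootPanel.get().add(image1);
RootPanel.get().add(image2);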