I've been working on adding a live camera view to a LibGDX application for Android. I followed this tutorial: code.google.com/p/libgdx/wiki/DeviceCameraIntegration
The camera view works properly on every device we've tested, but on some of them (it may depend on the Android version) there's a problem with textures: instead of a transparent background, a grey background is displayed (on some phones there's a black border instead of a grey background).
I've realized that this bug is caused by changing the SurfaceView format to translucent:
Code:
SurfaceView glView = (SurfaceView) graphics.getView();
glView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
I've decided to switch to the translucent format only in the part of the app that displays the camera view, and to go back to the standard opaque format after the camera is turned off. However, this causes other problems: switching to translucent and turning the camera on works, but going back to the menu and switching back to opaque results in a black screen (the application code keeps executing, I can hear the music, but the app doesn't react to input). By trial and error it turned out that on some devices, for reasons unknown to me, the same code works properly:
Code:
public void setSurfaceFormatToTranslucent() {
    try {
        if (this.graphics.getView() instanceof SurfaceView) {
            SurfaceView glView = (SurfaceView) this.graphics.getView();
            glView.setKeepScreenOn(false); // ** these two lines are added only when switching to translucent
            glView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
            glView.setKeepScreenOn(true); // **
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}

public void setSurfaceFormatToOpaque() {
    try {
        if (this.graphics.getView() instanceof SurfaceView) {
            SurfaceView glView = (SurfaceView) this.graphics.getView();
            glView.getHolder().setFormat(PixelFormat.OPAQUE);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
However, on some devices this still causes bugs similar to those mentioned above.
Does anyone know what the problem is and how to change the SurfaceView format properly while the application is running?
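For what it's worth, a minimal sketch of one thing worth ruling out, assuming the format switch is currently issued from the LibGDX render thread (which is not the UI thread): posting the SurfaceHolder.setFormat call to the UI thread. This is an assumption about the cause, not a confirmed fix.
Code:
// Sketch only: assumes this lives in the LibGDX AndroidApplication and that
// the format change may misbehave when called from the render thread.
public void setSurfaceFormat(final int format) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            if (graphics.getView() instanceof SurfaceView) {
                SurfaceView glView = (SurfaceView) graphics.getView();
                // e.g. PixelFormat.TRANSLUCENT or PixelFormat.OPAQUE
                glView.getHolder().setFormat(format);
            }
        }
    });
}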
Related
I'm currently developing an app and from within the app I'd like to let the user select custom images to use.
Right now I use
Code:
public final int GOT_IMAGE = 1;

private void getImage() {
    Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
    intent.setType("image/*");
    mUri = Uri.fromFile(new File(Environment.getExternalStorageDirectory(), "temp_image.jpg"));
    intent.putExtra(android.provider.MediaStore.EXTRA_OUTPUT, mUri);
    try {
        intent.putExtra("return-data", true);
        startActivityForResult(intent, GOT_IMAGE);
    } catch (ActivityNotFoundException e) {
        e.printStackTrace();
    }
}

protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (resultCode != RESULT_OK) {
        customImgChk.setChecked(false);
        return;
    }
    if (requestCode == GOT_IMAGE) {
        Bitmap image = BitmapFactory.decodeFile(mUri.getPath());
        if (image != null) {
            image = WPUtil.resizeBitmap(image, WPUtil.IMAGE_SIZE_X, WPUtil.IMAGE_SIZE_Y);
        } else {
            customImgChk.setChecked(false);
            Toast.makeText(this.getApplicationContext(), "Failed to grab image!", Toast.LENGTH_LONG).show();
        }
    }
}
This works great on a few devices but not all. I'd really like to come up with a universal way to perform this function, but I haven't found one yet.
I've thought about writing my own image selector and cropper, but I'd rather not reinvent the wheel.
Can anybody suggest a decent app/library that I can use to select and/or crop photos from within my app?
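In case it helps narrow things down, here is a minimal sketch of a variant that avoids relying on EXTRA_OUTPUT (which not every picker honors) and instead reads the content Uri the picker returns. The chooser title is just an example:
Code:
// Sketch: wrap ACTION_GET_CONTENT in a chooser and read the returned Uri,
// instead of assuming the picker wrote the file to EXTRA_OUTPUT.
private void getImage() {
    Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
    intent.setType("image/*");
    startActivityForResult(Intent.createChooser(intent, "Select image"), GOT_IMAGE);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == GOT_IMAGE && resultCode == RESULT_OK && data != null) {
        try {
            // data.getData() is the Uri of whatever the user picked.
            InputStream in = getContentResolver().openInputStream(data.getData());
            Bitmap image = BitmapFactory.decodeStream(in);
            in.close();
            // resize/use the bitmap as before
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}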
ImageJ is great. It is a standalone app, but there are published APIs for integrating it into your own apps.
It's better than that: it's open source, so you can use the jar and just ignore the GUI library.
Might look into that.
http://rsbweb.nih.gov/ij/
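By way of illustration, a minimal sketch of driving the ImageJ jar headlessly for a crop. The file paths are hypothetical, and since ImageJ depends on AWT, whether this runs on Android at all would need verifying:
Code:
import ij.IJ;
import ij.ImagePlus;
import ij.process.ImageProcessor;

// Sketch: open an image, crop a region, save the result, no GUI involved.
public class CropExample {
    public static void main(String[] args) {
        ImagePlus imp = IJ.openImage("/tmp/input.jpg");   // hypothetical input path
        ImageProcessor ip = imp.getProcessor();
        ip.setRoi(100, 100, 400, 300);                    // x, y, width, height of the crop
        imp.setProcessor(ip.crop());                      // crop to the ROI
        IJ.saveAs(imp, "jpg", "/tmp/cropped.jpg");        // hypothetical output path
    }
}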
Hi,
I'm playing around with the camera service. The preview mode works well, but if I take a picture and save it, it comes out a lot darker than in the preview. I tried it with the stock camera app and there is no such problem.
I tried it with many different values (iso, brightness, scene mode, white balance), but nothing changed except the preview. It seems that the preferences only take effect in preview mode.
Device: altek A14 leo
OS: Android 2.1
Picture saved by the standard camera app:
abload.de/img/stdapp8uql.jpg
Preview mode:
abload.de/img/previewrne6.jpg
Picture saved by my app:
abload.de/img/myappiugc.jpg
Parameters of the camera:
Code:
atk-frame=0
brightness-max=12
brightness-min=0
brightness=6
camera-id=1
contrast-max=2
contrast-min=0
contrast=1
effect-values=none,mono,sepia,whiteboard
effect=none
flash-mode-values=off,auto,on,red-eye
flash-mode=off
focus-mode-values=auto,infinity
focus-mode=auto
gps-altitude=0
gps-latitude=0.0
gps-longitude=0.0
gps-timestamp=1199145600
iso-values=auto,80,100,200,400,800,1600,3200
iso=auto
jpeg-quality=100
jpeg-thumbnail-height=384
jpeg-thumbnail-quality=100
jpeg-thumbnail-width=512
max-zoom=9
metering-values=spot,center,matrix
metering=center
min-zoom=0
orientation=landscape
picture-format-values=jpeg
picture-format=jpeg
picture-size-values=2048x1536,1280x960
picture-size=2048x1536
preview-format-values=yuv420sp
preview-format=yuv420sp
preview-frame-rate-values=15
preview-frame-rate=15
preview-size-values=640x480
preview-size=640x480
rotation=0
saturation-max=0
saturation-min=2
saturation=1
scene-mode-values=auto,portrait,landscape,night,beach,snow,sunset,fireworks,sports,candlelight
scene-mode=auto
SD9-poweron-mode=0
sharpness-max=2
sharpness-min=0
sharpness=1
smooth-zoom-supported=true
whitebalance-values=auto,fluorescent,daylight,cloudy-daylight
whitebalance=auto
zoom-factors=100,130,160,190,220,250,280,310,340,360,400
zoom-supported=true
zoom=0
Code:
private OnClickListener cameraClickListener = new OnClickListener() {
    public void onClick(View v) {
        inPreview = false;
        cam.takePicture(null, null, photoCallback);
    }
};

PictureCallback photoCallback = new PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        new SavePhotoTask().execute(data);
        startPreview();
    }
};

class SavePhotoTask extends AsyncTask<byte[], String, String> {
    protected String doInBackground(byte[]... imageData) {
        File photo = new File(Environment.getExternalStorageDirectory(), "photo.jpg");
        if (photo.exists()) {
            photo.delete();
        }
        try {
            FileOutputStream fos = new FileOutputStream(photo.getPath());
            fos.write(imageData[0]);
            fos.close();
            Log.i("IMAGE SAVED ASYNC", photo.getPath());
        } catch (java.io.IOException e) {
            Log.e("Error", "Exception in photoCallback", e);
        }
        return null;
    }
}
Does anybody have an idea what causes this behaviour?
Thanks.
Regards
I'm not sure what's happening in your case; as far as I can tell, your code should work. Try other camera apps, like 360. If the problem persists, go to a service center, as the problem would then be with the hardware.
Hi,
sorry for the late response. It was not a hardware issue. I just had to call autoFocus first and then call takePicture from the AutoFocusCallback.
Code:
private OnClickListener cameraClickListener = new OnClickListener() {
    public void onClick(View v) {
        if (!isFocusing) {
            isFocusing = true;
            cam.autoFocus(autoFocusCallback);
        }
    }
};

AutoFocusCallback autoFocusCallback = new AutoFocusCallback() {
    public void onAutoFocus(boolean success, Camera camera) {
        if (success) {
            isFocusing = false;
            camera.takePicture(null, null, photoCallback);
        }
    }
};
regards
Code:
tog = (ToggleButton) findViewById(R.id.bDi); // toggle button
tog.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
    @Override
    public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
        while (tog.isChecked()) {
            sendData("A");
            try {
                Thread.sleep(300);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            sendData("B");
            try {
                Thread.sleep(300);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            sendData("C");
            try {
                Thread.sleep(300);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
});
Basically I am interfacing hardware with an Android device and I have to send some characters one by one. With this code, when I push the toggle button, the loop executes infinitely and the rest of my program doesn't work, even if the toggle button is pushed again.
I want to keep the loop executing until I uncheck the toggle button. If I use 'if' instead of 'while', the loop runs only once, and I have to uncheck and re-check the toggle button to make it run again.
You need to put the while-loop onto another thread.
As long as code is running on the UI thread, the user cannot do anything with the UI. It is frozen.
(Every listener of the standard Android classes runs on the UI thread.)
So the solution should be putting that in an AsyncTask. (Don't use a bare Thread here, even though I said to use another thread: a Thread does not give you access to the UI, and you would need Handlers for the UI work. Use an AsyncTask instead. AsyncTask does everything for you; it is a wrapper class around Thread and Handler and has its own thread pool.)
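To make that concrete, here is a minimal sketch, assuming sendData() is safe to call from a worker thread and that a field such as "private SendLoopTask sendTask;" exists (the names are made up for illustration):
Code:
// Sketch: the loop runs in doInBackground, off the UI thread, and stops
// when the task is cancelled by unchecking the toggle.
private class SendLoopTask extends AsyncTask<Void, Void, Void> {
    @Override
    protected Void doInBackground(Void... params) {
        while (!isCancelled()) {
            for (String s : new String[] { "A", "B", "C" }) {
                sendData(s);
                try {
                    Thread.sleep(300);
                } catch (InterruptedException e) {
                    return null; // cancelled while sleeping
                }
            }
        }
        return null;
    }
}

// In the listener: start the task when checked, cancel it when unchecked.
tog.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
    @Override
    public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
        if (isChecked) {
            sendTask = new SendLoopTask();
            sendTask.execute();
        } else if (sendTask != null) {
            sendTask.cancel(true); // true interrupts the sleep
        }
    }
});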
I've been trying to learn to program for Android lately by simply coming up with ideas and implementing them. It started out easy enough with pulling information from an online RSS feed and showing this in a nice environment. But the current idea has me stumped.
I'd like the following to happen:
Take a picture using intent
Entire picture is shown in a new activity
Zoom in on a certain spot
Add predefined items to the picture
Press next which connects the items from left to right
Add some more items
Press next to connect the new items
Zoom out
Save the image
First, taking a picture. This wasn't too hard using the camera intent:
Code:
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
fileUri = getOutputMediaFileUri(MEDIA_TYPE_IMAGE);
intent.putExtra(MediaStore.EXTRA_OUTPUT, fileUri);
I can then extract the absolute path from fileUri with the following line:
Code:
String path = new File(fileUri.getPath()).getAbsolutePath();
This path can be put in a bundle, which is put in an intent, which is then used to start the activity that should show the image.
Code:
public class TestActivity extends Activity implements SurfaceHolder.Callback {
    private static final String TAG = TestActivity.class.getSimpleName();
    private String path = "";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SurfaceView view = new SurfaceView(this);
        setContentView(view);
        Intent intent = getIntent();
        Bundle bundle = intent.getExtras();
        path = bundle.getString("path");
        view.getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Canvas canvas = holder.lockCanvas();
        if (canvas == null) {
            Log.d(TAG, "Can't draw because canvas is null");
        } else {
            Bitmap bitmap = BitmapFactory.decodeFile(path);
            Paint paint = new Paint();
            if (bitmap == null) {
                Log.d(TAG, "Can't draw because bitmap is null");
            } else {
                canvas.drawBitmap(bitmap, 0, 0, paint);
            }
            holder.unlockCanvasAndPost(canvas);
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int frmt, int w, int h) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }
}
The first issue here is that it of course doesn't show the entire photo. Not that surprising, considering the photo is larger than the view. Ideally I'd like to zoom out to show the entire photograph, and once I can zoom one way, I'd assume that zooming in on the part you want should also be possible.
Next is adding the objects. My idea was simply catching touch events and adding a new object once the finger is released. This way I'd end up with a list of items, each having a draw function which can be called through the SurfaceView when it is redrawn.
Connecting these items could simply be done by creating a line object and going through the list of all items, using their locations for the begin and end points of the lines.
One of the big issues here is that the x and y locations would be relative to the screen, not to the photo, which would mean that when you zoom back out, the entire background would change but the actual items would remain at the same spot and the same size.
I've been searching and searching for any tutorial or other question about the same issue, but either I've had no luck or I've been using the wrong keywords. And for all I know, everything I have up to now could be wrong.
If anyone could give some pointers, or maybe knows a guide or tutorial somewhere, or some better keywords I could use for searching, I'd really appreciate it.
Xylon- said:
One of the big issues here is that the x and y locations would be relative to the screen, not to the photo, which would mean that when you zoom back out, the entire background would change but the actual items would remain at the same spot and the same size.
Wouldn't you just need to either control or track the sampling/scale of the image, so that you know the first pixel displayed (top left) and the scale factor? Then eventX/eventY can be processed to be relative to whatever you want.
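In code, that bookkeeping might look something like this sketch, where offsetX/offsetY (the photo pixel drawn at the view's top-left corner) and scale (screen pixels per photo pixel) are assumed to be tracked by your view:
Code:
// Sketch: convert between screen (touch) coordinates and photo coordinates,
// given the tracked pan offset and scale factor.
public final class PhotoCoords {
    private PhotoCoords() {}

    // Screen touch -> photo pixel: store items in photo coordinates.
    public static float[] toPhoto(float touchX, float touchY,
                                  float offsetX, float offsetY, float scale) {
        return new float[] { offsetX + touchX / scale, offsetY + touchY / scale };
    }

    // Photo pixel -> screen position: used when redrawing items after pan/zoom.
    public static float[] toScreen(float photoX, float photoY,
                                   float offsetX, float offsetY, float scale) {
        return new float[] { (photoX - offsetX) * scale, (photoY - offsetY) * scale };
    }
}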
I just started using the Camera2 framework because of the increased control it provides over the low-level functions of the camera. However, I am having some trouble turning the flashlight on and off quickly. With the old Camera API, I could toggle the flash while supplying a preview with:
Code:
try {
    android.hardware.Camera.Parameters parameters = c.getParameters();
    if (parameters.getFlashMode().equals(Camera.Parameters.FLASH_MODE_OFF)) {
        parameters.setFlashMode(Parameters.FLASH_MODE_TORCH);
        Log.i("HeartBeatAlgorithm", "LightOn");
    } else if (parameters.getFlashMode().equals(Camera.Parameters.FLASH_MODE_TORCH)) {
        parameters.setFlashMode(Parameters.FLASH_MODE_OFF);
        Log.i("HeartBeatAlgorithm", "LightOff");
    }
    c.setParameters(parameters);
} catch (Exception exception) {
    c.release();
    c = null;
}
And the flashlight would quickly turn on or off without any noticeable interruption. With Camera2, however, it seems as though the flash mode is a property of the CaptureSession, meaning an entirely new CaptureSession needs to be created to change the flash mode, i.e.:
Code:
try {
    SurfaceTexture texture = mTextureView.getSurfaceTexture();
    assert texture != null;
    texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
    Surface surface = new Surface(texture);
    mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    if (mLightNowOn) {
        mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
    } else {
        mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
    }
    mPreviewRequestBuilder.addTarget(surface);
    mPreviewRequest = mPreviewRequestBuilder.build();
    mCameraDevice.createCaptureSession(Arrays.asList(surface), mSessionStateCallback, null);
} catch (CameraAccessException e) {
    e.printStackTrace();
}
As is mentioned in the developer docs, "Creating a session is an expensive operation and can take several hundred milliseconds, since it requires configuring the camera device's internal pipelines and allocating memory buffers for sending images to the desired targets." It definitely does, and there is a noticeable delay in my app when toggling flash mode.
I really need to be able to toggle flash modes quickly without interrupting the preview so much. Is there any way around this, or is it unavoidable due to the new API pipeline?
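For comparison, a minimal sketch of one approach that may avoid rebuilding the session: keep the session alive (assumed here to be stored in a field, hypothetically named mCaptureSession) and just resubmit the repeating request with the new FLASH_MODE. Whether this removes the delay on a given device would need testing:
Code:
// Sketch: reuse the existing session and swap only the repeating request,
// instead of calling createCaptureSession again. mCaptureSession and
// mPreviewRequestBuilder are assumed fields from the preview setup.
private void setTorch(boolean on) {
    try {
        mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE,
                on ? CaptureRequest.FLASH_MODE_TORCH : CaptureRequest.FLASH_MODE_OFF);
        mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), null, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}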