Android Camera picture not showing in ImageView











I am trying to take a picture with the camera of a "Nexus 5X API 26" device and show it in an ImageView before uploading it to Firebase.



The first problem is that after taking a picture, it does not show up in the ImageView. I launch the camera like this:



Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (intent.resolveActivity(getPackageManager()) != null) {
    startActivityForResult(intent, CAMERA_REQUEST_CODE);
}


Then I tried to display the picture the same way I do for pictures picked from the Gallery:



filePath = data.getData();
try {
    Bitmap bitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), filePath);
    imageView.setImageBitmap(bitmap);
} catch (IOException e) {
    e.printStackTrace();
}


The part I do not understand is how data.getData() works in the two cases (Gallery vs. Camera).



I assume the uploadImage() method should be the same for both Gallery and Camera uploads (it already works for the Gallery).



So what I am currently missing, I guess, is the filePath. Is it necessary to temporarily save the camera picture somewhere in order to call getData(), or can it work without any kind of saving?



Ultimately I just want the user to take a picture and have it sent to Firebase. The user does not need to see it in the ImageView first; I only need the Uri (data) in order to upload it.
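For reference, a Uri-based upload of that kind could look roughly like the sketch below, assuming Firebase Storage; the storage path, log tags, and listener bodies are placeholders, since the post does not show uploadImage().

// Minimal sketch (assumption): upload the image behind a content Uri to Firebase Storage.
// Requires com.google.firebase.storage.FirebaseStorage / StorageReference.
// The "images/..." path is a placeholder, not taken from the question.
StorageReference ref = FirebaseStorage.getInstance()
        .getReference()
        .child("images/" + System.currentTimeMillis() + ".jpg");

ref.putFile(filePath)  // filePath is the Uri obtained in onActivityResult()
        .addOnSuccessListener(taskSnapshot -> Log.d("Upload", "Image uploaded"))
        .addOnFailureListener(e -> Log.e("Upload", "Upload failed", e));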



[Screenshot: view when I open the camera]
[Screenshot: view after taking a picture]










android firebase firebase-realtime-database android-camera-intent






asked Nov 9 at 15:19 by Marko Otasevic, edited Nov 14 at 13:22
  • There's probably an Exception being thrown. Check what your catch block prints.
    – TheWanderer
    Nov 9 at 15:21










  • getData() works entirely differently for the camera intent. Check stackoverflow.com/a/40715056/192373 and links around that. Usually, as in stackoverflow.com/questions/45046001/…, we pass EXTRA_OUTPUT to tell the camera app where to store the picture. Instead, you can get the last picture taken by camera in onActivityResult(). [A sketch of this approach follows after these comments.]
    – Alex Cohn
    Nov 10 at 6:38










  • Thanks, the posts + further refinement worked out in the end :)
    – Marko Otasevic
    Nov 23 at 13:06
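The EXTRA_OUTPUT approach that Alex Cohn's comment points to could look roughly like the sketch below. The FileProvider authority, file name, and the matching <provider>/file_paths declaration in AndroidManifest.xml are assumptions, not part of the original question.

// Sketch (assumption): ask the camera app to write the full-size picture to a file we control.
// Requires androidx.core.content.FileProvider and a <provider> entry in AndroidManifest.xml
// whose file_paths resource covers the external files directory used here.
File photoFile = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES), "capture.jpg");
Uri photoUri = FileProvider.getUriForFile(this, "com.example.app.fileprovider", photoFile);

Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
intent.putExtra(MediaStore.EXTRA_OUTPUT, photoUri);
if (intent.resolveActivity(getPackageManager()) != null) {
    startActivityForResult(intent, CAMERA_REQUEST_CODE);
}

// In onActivityResult() the camera result Intent's getData() may be null;
// read the full-size image from the Uri supplied above instead:
// Bitmap bitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), photoUri);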


















1 Answer
This will help you.



Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, 7);


@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
    super.onActivityResult(requestCode, resultCode, data);

    // Guard against a cancelled capture or a missing result Intent.
    if (requestCode != 7 || resultCode != RESULT_OK || data == null) {
        return;
    }

    // Without EXTRA_OUTPUT the camera app returns only a small thumbnail in the "data" extra.
    // (bitmap, imageEncoded, bitmap1 and img are fields of the Activity.)
    bitmap = (Bitmap) data.getExtras().get("data");

    // Compress the thumbnail to JPEG and Base64-encode it.
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    bitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
    byte[] b = baos.toByteArray();
    imageEncoded = Base64.encodeToString(b, Base64.DEFAULT);

    // Decode the Base64 string back into a Bitmap and show it in the ImageView.
    byte[] imageAsBytes = Base64.decode(imageEncoded.getBytes(), Base64.DEFAULT);
    InputStream is = new ByteArrayInputStream(imageAsBytes);
    bitmap1 = BitmapFactory.decodeStream(is);
    img.setImageBitmap(bitmap1);
}
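Since the goal in the question is to upload the image to Firebase, the JPEG bytes produced above could also be sent straight to Firebase Storage. A minimal sketch, assuming Firebase Storage and a placeholder storage path (not part of this answer):

// Sketch (assumption): upload the compressed JPEG bytes from `baos` above to Firebase Storage.
// The "camera/..." path is a placeholder.
StorageReference ref = FirebaseStorage.getInstance()
        .getReference()
        .child("camera/" + System.currentTimeMillis() + ".jpg");

ref.putBytes(baos.toByteArray())
        .addOnSuccessListener(snapshot -> Log.d("Upload", "Thumbnail uploaded"))
        .addOnFailureListener(e -> Log.e("Upload", "Upload failed", e));

Note that this uploads only the small thumbnail returned in the "data" extra; for a full-size picture, pass EXTRA_OUTPUT when launching the camera, as discussed in the question's comments.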






answered Nov 9 at 18:02 by Vishal Sharma, edited Nov 10 at 15:47 by Manhar












  • First test on a small image and then on a large image, then send a screenshot of the problem.
    – Vishal Sharma
    Nov 14 at 12:59

















