I am sending a screenshot from a Java server hosted on a machine. The screenshot is sent to an Android device through sockets as a byte array. But the array I receive on the Android device is not getting converted to a bitmap. Below I am attaching the Java server code, the Android code, and the logcat.
Here is the Java server code for sending the captured screenshot.
socket2 = serverSocket2.accept();
System.out.println("A client has connected");
Robot robot = new Robot();
String format = "jpg";
String fileName = "FullScreenshot." + format;
Rectangle screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
BufferedImage screenFullImage = robot.createScreenCapture(screenRect); // captured image
ImageIO.write(screenFullImage, format, new File("C:\\Users\\User\\Desktop\\"+fileName));
ByteArrayOutputStream bao=new ByteArrayOutputStream();
ImageIO.write(screenFullImage, format, bao); // write the image into the byte array stream
byte[] ar=bao.toByteArray();
System.out.println("Parental block is executed");
mainWriter = new BufferedWriter(new OutputStreamWriter(socket2.getOutputStream()));
mainWriter.write(java.util.Arrays.toString(ar));
mainWriter.newLine();
mainWriter.flush();
System.out.println("A full screenshot saved!");
serverSocket2.close();
socket2.close();
mainWriter.close();
Testing t = new Testing();
Here is the Android code where I receive the byte stream array.
public void PCConnection(final View view) { // just for this activity
    new Thread() {
        public Socket socket;

        public void run() {
            try {
                Log.i(DebuggString, "Attempting to connect to the server");
                socket = new Socket(hostname, 60120);
                Log.i(DebuggString, "Connection established");
                mivScreenShot = (ImageView) findViewById(R.id.ivScreenShot);
                // Receive the message from the server;
                // the message is returned by br.readLine()
                brr = new BufferedReader(new InputStreamReader(socket.getInputStream()));
                img = brr.readLine();
                Log.d("Image", img); // this logs the byte[] as text
                final ByteArrayInputStream arrayInputStream = new ByteArrayInputStream(img.getBytes());
                arrayInputStream.reset();
                this.socket.close();
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        Glide.with(ParentalControl.this)
                                .load(bitmap)
                                .asBitmap()
                                .into(mivScreenShot);
                    }
                });
                if (bitmap != null) {
                    Log.d(DebuggString, "Bitmap is not null");
                } else {
                    Log.d(DebuggString, "Bitmap is null");
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }.start();
}
Here I am attaching the logcat screenshot url: https://i.imgur.com/167Vje3.png
mainWriter = new BufferedWriter(new OutputStreamWriter(socket2.getOutputStream()));
mainWriter.write(java.util.Arrays.toString(ar));
You cannot use writers and strings to send a JPEG image; those classes are for text only. Do away with them. Do away with the ByteArrayOutputStream too, since you can compress the image straight to the socket's output stream.
Further, check how many bytes you send against how many are received.
That covers the server. On the receiving side you likewise cannot use readers and strings; read raw bytes from the InputStream instead.
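As a sketch of that raw-byte approach (the class and method names here are illustrative, not part of the original code): the sender writes a 4-byte length header followed by the JPEG bytes, and the receiver reads back exactly that many bytes.

```java
import java.io.*;

// Illustrative length-prefixed framing over raw streams; on the server,
// the byte[] would come from ImageIO.write into a ByteArrayOutputStream.
public class ImageFraming {

    // Sender side: write the length header, then the image bytes.
    public static void writeFrame(DataOutputStream out, byte[] data) throws IOException {
        out.writeInt(data.length);
        out.write(data);
        out.flush();
    }

    // Receiver side: read the header, then block until all bytes arrive.
    public static byte[] readFrame(DataInputStream in) throws IOException {
        int length = in.readInt();
        byte[] data = new byte[length];
        in.readFully(data); // unlike read(), this loops until the buffer is full
        return data;
    }
}
```

On Android, the returned array can then be decoded directly with BitmapFactory.decodeByteArray(data, 0, data.length).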
I need help transferring a PNG image via TCP from my Raspberry Pi (Python) to my Android application (Java). I have spent almost two weeks trying to understand and solve this problem, so any help would be greatly appreciated.
I have set up a client-server architecture such that my Raspberry Pi 3 records audio, performs some analysis on it, and then sends the data (via TCP) to the Android app to display on the app screen. The recording and analysis are done, and I am able to make the connection and transfer string data that displays on the app with no problem. However, I have been unsuccessful in transferring an image from the rpi to the Android app. So basically, the image is stored on the rpi and I am attempting to transfer it to the app to display it.
Current Implementation:
On rpi (python): Like I said, sending strings and displaying them on the android app is done without any problem. When I am sending the image portion of the audio analysis, I send a string first that says "?start" so that the android side knows that an image instead of a string is about to be sent (and will wait to update the GUI until it receives the entire image). Then, I open the image stored on rpi and read the entire image as a byte array (typically about 40-50k bytes). I get the length of the byte array and send that as a string to android app. Finally, I send the byte array to the android and it waits for an OK message from the app. All of this works without reporting any errors.
On Android app (Java): When the app receives the "?start" string, it uses a BufferedReader (which is what I used to read the string data I had transferred to the app successfully earlier) to read the size of the image byte array. Then, I create a buffer, msg_buff, to read in at most 1024 bytes at a time, while baos holds the entire byte array of the image. In the infinite while loop, I have a DataInputStream, called in, read bytes into msg_buff, returning the number of bytes read. Then, I add the contents of msg_buff to baos. Once the number of bytes read from in is -1, or img_offset (which is just the total number of bytes read) is greater than or equal to the size of the image byte array, the while loop is broken. Then, I attempt to save the image to Android internal storage and load it later into an ImageView to display it. This code does successfully read in the bytes until there are around 2000-3000 bytes left to be read, and then it seems to freeze on the int bytes_read = in.read(msg_buff, 0, byte_size) line. I have not been able to get past that point, so I do not know whether saving the image to internal storage and then loading it into the ImageView that way will work either. I believe it is freezing on this line because some bytes are being lost or not sent from Python to Java. Does anyone know how I can resolve this?
The code that reads the image data from the python server is in the run() method.
TCPClient.java
import android.content.Context;
import android.content.ContextWrapper;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.Log;
import java.io.*;
import java.net.InetAddress;
import java.net.Socket;
public class TcpClient {
public static final String SERVER_IP = myIPAddress; //your computer IP address
public static final int SERVER_PORT = myPortNumber;
// message to send to the server
private String mServerMessage;
// sends message received notifications
private OnMessageReceived mMessageListener = null;
// while this is true, the server will continue running
private boolean mRun = false;
// used to send messages
private PrintWriter mBufferOut;
// used to read messages from the server
private BufferedReader mBufferIn;
/**
* Constructor of the class. OnMessagedReceived listens for the messages received from server
*/
public TcpClient(OnMessageReceived listener) {
mMessageListener = listener;
}
/**
* Sends the message entered by client to the server
*
* @param message text entered by client
*/
public void sendMessage(String message) {
if (mBufferOut != null && !mBufferOut.checkError()) {
mBufferOut.println(message);
mBufferOut.flush();
}
}
/**
* Close the connection and release the members
*/
public void stopClient() {
Log.i("Debug", "stopClient");
mRun = false;
if (mBufferOut != null) {
mBufferOut.flush();
mBufferOut.close();
}
mMessageListener = null;
mBufferIn = null;
mBufferOut = null;
mServerMessage = null;
}
public void run() {
mRun = true;
try {
//here you must put your computer's IP address.
InetAddress serverAddr = InetAddress.getByName(SERVER_IP);
Log.e("TCP Client", "C: Connecting...");
//create a socket to make the connection with the server
Socket socket = new Socket(serverAddr, SERVER_PORT);
try {
InputStream sin = socket.getInputStream();
OutputStream sout = socket.getOutputStream();
DataInputStream in = new DataInputStream(sin);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
mBufferIn = new BufferedReader(new InputStreamReader(socket.getInputStream()));
//in this while the client listens for the messages sent by the server
while (mRun) {
mServerMessage = mBufferIn.readLine();
if (mServerMessage != null && mMessageListener != null) {
//Check if data is image
if(mServerMessage.equals("?start"))
{
mServerMessage = mBufferIn.readLine();
String fileName = "";
if(mServerMessage.equals("signal"))
{
fileName = "signal.jpeg";
}
else if(mServerMessage.equals("spec"))
{
fileName = "spec.jpeg";
}
// Get length of image byte array
int size = Integer.parseInt(mBufferIn.readLine());
Log.i("Debug:", "image message size: "+size);
// Create buffers
byte[] msg_buff = new byte[1024];
//byte[] img_buff = new byte[size];
int img_offset = 0;
while(true){
int byte_size = msg_buff.length;
int bytes_read = in.read(msg_buff, 0, byte_size);
Log.i("Debug:", "image message bytes:" + bytes_read);
if(bytes_read == -1){
break;
}
//copy bytes into img_buff
//System.arraycopy(msg_buff, 0, img_buff, img_offset, bytes_read);
baos.write(msg_buff, 0, bytes_read);
img_offset += bytes_read;
Log.i("Debug:", "image message bytes read:"+img_offset);
if( img_offset >= size)
{
break;
}
}
try{
byte[] data = baos.toByteArray();
ByteArrayInputStream bais = new ByteArrayInputStream(data);
ContextWrapper cw = new ContextWrapper(ApplicationContextProvider.getContext());
File directory = cw.getDir("imageDir", Context.MODE_PRIVATE);
File mypath = new File(directory, fileName);
//Bitmap bitmap = BitmapFactory.decodeByteArray(img_buff, 0, img_buff.length);
Bitmap bitmap = BitmapFactory.decodeStream(bais);
FileOutputStream fos = new FileOutputStream(mypath);
//Use compress method on Bitmap object to write image to OutputStream
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fos);
fos.flush();
fos.close();
//Send OK
byte[] OK = new byte[] {0x4F, 0x4B};
sout.write(OK);
} catch (Exception e) {
Log.i("Debug:", "image message" +e);
e.printStackTrace();
}
}
//call the method messageReceived from MyActivity class
mMessageListener.messageReceived(mServerMessage);
}
}
Log.e("RESPONSE FROM SERVER", "S: Received Message: '" + mServerMessage + "'");
} catch (Exception e) {
Log.e("TCP", "S: Error", e);
} finally {
//the socket must be closed. It is not possible to reconnect to this socket
// after it is closed, which means a new socket instance has to be created.
socket.close();
}
} catch (Exception e) {
Log.e("TCP", "C: Error", e);
}
}
//Declare the interface. The method messageReceived(String message) must be implemented in the MainActivity
//class, in the AsyncTask's doInBackground
public interface OnMessageReceived {
void messageReceived(String message);
}
}
MainActivity.java:
import android.app.Application;
import android.content.Context;
import android.content.ContextWrapper;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.drawable.Drawable;
import android.os.AsyncTask;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.*;
import org.apache.commons.codec.binary.Base64;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
public class MainActivity extends AppCompatActivity {
private TcpClient mTcpClient;
private TextView dbView;
private TextView roomView;
private TextView classView;
private TextView statusView;
private TextView timeView;
private ImageView signalView;
private ImageView specView;
private Button getAnalysis;
private Button disconnect;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
getAnalysis = findViewById(R.id.get_analysis);
dbView = findViewById(R.id.db_level);
roomView = findViewById(R.id.RoomsValues);
classView = findViewById(R.id.ClassValues);
timeView = findViewById(R.id.timeStamp);
signalView = findViewById(R.id.audioPic);
specView = findViewById(R.id.specPic);
statusView = findViewById(R.id.status);
disconnect = findViewById(R.id.disconnect);
getAnalysis.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v)
{
statusView.setText("Connecting to Auris...\nRoom analytics will arrive shortly.");
new ConnectTask().execute("");
}
});
disconnect.setOnClickListener(new View.OnClickListener(){
@Override
public void onClick(View v)
{
mTcpClient.stopClient();
statusView.setText("Disconnected from Auris.\nReconnect to receive room analysis updates.");
}
});
}
public class ConnectTask extends AsyncTask<String, String, TcpClient> {
@Override
protected TcpClient doInBackground(String... message) {
//we create a TCPClient object and
mTcpClient = new TcpClient(new TcpClient.OnMessageReceived() {
@Override
//here the messageReceived method is implemented
public void messageReceived(String message) {
//this method calls the onProgressUpdate
publishProgress(message);
Log.i("Debug","Input message: " + message);
}
});
//statusView.setText("Get analysis from Auris as it is collected.");
mTcpClient.run();
return null;
}
@Override
protected void onProgressUpdate(String... values) {
super.onProgressUpdate(values);
//Store string of values sent from Auris device
String str = values[0];
//if data starts with +, then it is the string data
if(str.startsWith("+"))
{
//Split values around spaces
/*
Values in data indices
0-8 are room log likelihoods
9-12 are class log likelihoods
13 is dbA level
14 is room model best matched
15 is class model best matched
*/
// Remove +
str = str.substring(1);
String data[]= str.split(" ");
String roomData = "";
String classData = "";
String status;
for(int i = 0; i < 9; i++)
{
roomData = roomData.concat(data[i]);
roomData = roomData.concat("\n");
}
roomView.setText(roomData);
for(int i = 9; i < 13; i++)
{
classData = classData.concat(data[i]);
classData = classData.concat("\n");
}
classView.setText(classData);
dbView.setText(data[13]);
status = "The room most closely matches " + data[14] + " room model & " + data[15] + " class model.";
statusView.setText(status);
}
else if (str.startsWith("TIME"))
{
// Remove "TIME"
str.substring(4);
String message = "This room profile represents the room at " + str + ".";
timeView.setText(message);
}
else
{
try {
String fileName = "";
if(str.equals("signal"))
{
fileName = "signal.jpeg";
}
else if(str.equals("spec"))
{
fileName = "spec.jpeg";
}
ContextWrapper cw = new ContextWrapper(ApplicationContextProvider.getContext());
File directory = cw.getDir("imageDir", Context.MODE_PRIVATE);
File file = new File(directory, fileName);
Bitmap bitmap = BitmapFactory.decodeStream(new FileInputStream(file));
signalView.setImageBitmap(bitmap);
} catch (FileNotFoundException e){
e.printStackTrace();
}
}
Log.i("onProgressUpdate",values[0]);
}
}
}
Python Code to send Image Data:
def send_image_to_byte_array(image_file, conn, label):
    with open(image_file, "rb") as imageFile:
        content = imageFile.read()
        conn.sendall("?start\n".encode('utf-8'))
        conn.sendall(label.encode('utf-8'))
        size = len(content)
        strSize = str(size) + "\n"
        conn.sendall(strSize.encode('utf-8'))
        conn.sendall(content)
From what I can tell, not all of the bytes of the image are successfully being sent from the rpi to the Android app. There is data loss, which causes the Android app to hang on the int bytes_read = in.read(msg_buff, 0, byte_size); line in the run() method of TCPClient.java. From reading different posts, it seems that using struct.unpack/pack fixes this problem when transferring an image from Python to Python, but I do not know how to implement struct.unpack in Java, or whether I can just use some input stream. I am also not sure what would be the best approach to using struct.pack in Python. Any help is greatly appreciated!
EDIT:
I believe the problem is the endianess. From what I have read, raspberry pi is little endian and java is big endian. So, when I read the image that was saved to raspberry pi and try to transmit it to java from python, these issues are occurring. Does anyone know how I can change the endianness of java from big to little or some other way to fix this problem?
The issue is caused by the BufferedReader reading in extra data off the line (in order to fill its internal buffer), which makes that data unavailable from in.read().
As you can see from a sample Android BufferedReader implementation, a call to readLine() causes the BufferedReader to attempt to fill its internal buffer. It will do this using any available bytes on its source InputStream, up to a limit of 8192 chars. And if the BufferedReader has read those bytes, they won't be there when you try to get them from in.read(). This throws off your entire size-counting system, meaning you eventually end up blocking in in.read(), because you didn't read all the data you expected to.
The most expedient solution is probably to implement your own version of readLine(), that assembles strings, one byte at a time, until it hits a '\n'. After all, the only reason you needed a BufferedReader was for the readLine() function.
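A minimal sketch of such a byte-at-a-time readLine (assuming UTF-8 text terminated by '\n', as the Python sender uses; the class name is illustrative) might look like:

```java
import java.io.*;

// Reads one '\n'-terminated line a byte at a time, so nothing past the
// newline is consumed from the stream -- binary data that follows stays
// available to in.read()/readFully().
public class LineReader {
    public static String readLine(InputStream in) throws IOException {
        ByteArrayOutputStream line = new ByteArrayOutputStream();
        int b;
        while ((b = in.read()) != -1) {
            if (b == '\n') {
                break; // terminator reached; do not include it
            }
            line.write(b);
        }
        return line.toString("UTF-8");
    }
}
```

Using something like this in place of BufferedReader.readLine() for the "?start", label, and size lines leaves the DataInputStream positioned exactly at the first image byte.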
I'm writing a trading program that allows users to upload their own images for other clients to view when looking at their posted listing. The image file (.png or .jpeg file) is sent from the server when requested.
It seems that when using a FileInputStream as the parameter for a new Image, the client attempting to open the image looks for that file on its own computer (checking its own file system) instead of directly reading the image file itself that has been sent to it from the server.
In the example I am posting, let's assume that this is ran in the IDE and not as a JAR. The server program who is sending the file is able to access "avatar.png" successfully from its 'src' directory.
I have built a minimal executable example below, consisting of a server and client to represent the realistic issue at hand. The server and client examples were ran on separate computers on a local network. The example replicates the issue.
From the FileInputStream documentation:
FileInputStream is meant for reading streams of raw bytes such as image data.
The example throws the following exception:
java.io.FileNotFoundException: avatar.png (No such file or directory)
Which shows that the client is looking for 'avatar.png' on its own file system.
For example, the server:
public class ImageTesterServer {
public static void main(String[] args) {
ImageTesterServer server = new ImageTesterServer();
server.sendImage();
}
private void sendImage()
{
ServerSocket server = null;
Socket client = null;
try {
// Accept client connection, create new File from the 'avatar.png' image, and send to client
server = new ServerSocket();
System.out.println("Awaiting client connection...");
client = server.accept();
System.out.println("Client connected.");
File imageFile = new File("avatar.png");
ObjectOutputStream oos = new ObjectOutputStream(client.getOutputStream());
oos.writeObject(imageFile);
System.out.println("Sent image file.");
} catch (IOException e) {
e.printStackTrace();
}
finally { // Close sockets
try {
if (client != null)
client.close();
if (server != null)
server.close();
} catch (IOException ex) {
ex.printStackTrace();
}
}
}
}
The client:
public class ImageTesterClient extends Application {
public static void main(String[] args)
{
Application.launch(args);
}
@Override
public void start(Stage primaryStage)
{
ImageTesterClient client = new ImageTesterClient();
Image avatar = client.getServerFile(); // Retrieve the image's file from the server
ImageView picture = new ImageView(avatar);
StackPane root = new StackPane();
root.getChildren().add(picture);
Scene scene = new Scene(root);
primaryStage.setScene(scene);
primaryStage.show();
}
private Image getServerFile()
{
Socket socket = null;
ObjectInputStream ois;
try {
socket = new Socket("192.168.1.147", 5000); // Open new socket connection to server on another local network computer
ois = new ObjectInputStream(socket.getInputStream());
File imageFile = (File)ois.readObject();
Image avatar = new Image(new FileInputStream(imageFile));
return avatar;
} catch (IOException | ClassNotFoundException ex) {
ex.printStackTrace();
}
finally { // Close the socket if not null
try {
if (socket != null)
socket.close();
} catch (IOException ex) {
ex.printStackTrace();
}
}
return null;
}
}
How do I force the client Image to use the file (avatar.png) itself as received from the server, instead of attempting to look it up on its own file system?
The File class does not represent a file’s contents. It is merely a representation of a file path. This confusion is one of many reasons the File class is considered obsolete (though not deprecated), and has been replaced by the Path class.
So, when you do oos.writeObject(imageFile);, you are not sending the contents of the file, only the path of that file. Obviously, the existence of a file path on one computer does not guarantee that the same path is valid on another computer.
You will have to send the file's content separately. One way to do this is to open a FileInputStream, wrap it in a BufferedInputStream, and use that BufferedInputStream's transferTo method:
oos.writeObject(imageFile);
try (InputStream stream =
new BufferedInputStream(new FileInputStream(imageFile))) {
stream.transferTo(oos);
}
And on the client side, use the File’s length to determine how many bytes to read:
File imageFile = (File) ois.readObject();
long length = imageFile.length();
Path imagePath = Files.createTempFile(null, imageFile.getName());
try (FileChannel imageChannel = FileChannel.open(imagePath,
StandardOpenOption.WRITE, StandardOpenOption.TRUNCATE_EXISTING)) {
imageChannel.transferFrom(Channels.newChannel(ois), 0, length);
}
Image avatar = new Image(imagePath.toUri().toString());
I created a C# server that sends a Bitmap through a socket to an Android client. That Bitmap is constantly updating because it's a video feed.
Server C#
private void send_data() {
    ImageConverter converter = new ImageConverter();
    byte[] sendBytes = (byte[]) converter.ConvertTo(master.picturebox_master.Image, typeof(byte[]));
    string_master_frame = System.Text.Encoding.UTF8.GetString(sendBytes);
    string_master_frame = Convert.ToBase64String(sendBytes);
    data = string_master_frame + "\n";
    tcpServer1.Send(data);
}
Client Android
@Override
protected Void doInBackground(Void... arg0) {
    Socket socket = null;
    try {
        socket = new Socket(dstAddress, dstPort);
        Scanner r = new Scanner(new InputStreamReader(socket.getInputStream()));
        while (true) {
            valores[26] = r.nextLine();
            publishProgress(valores[26]);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}

@Override
protected void onProgressUpdate(String... values) {
    byte[] decodedString = Base64.decode(values[26], Base64.NO_WRAP);
    Bitmap master_bitmap = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length);
    master_frame.setImageBitmap(master_bitmap);
}
The first frame is sent and the Android client displays it correctly. But when the next frame comes, the Android client crashes.
Error:
Process: com.example.tiago.java_android, PID: 826
java.lang.IllegalArgumentException: bad base-64
at android.util.Base64.decode(Base64.java:161)
at android.util.Base64.decode(Base64.java:136)
at android.util.Base64.decode(Base64.java:118)
at com.example.tiago.java_android.Cliente.onProgressUpdate(Cliente.java:228)
at com.example.tiago.java_android.Cliente.onProgressUpdate(Cliente.java:28)
at android.os.AsyncTask$InternalHandler.handleMessage(AsyncTask.java:656)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5431)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:914)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:707)
I thought the valores[26] data was being corrupted, but it's not. I get the data correctly.
I used http://codebeautify.org/base64-to-image-converter to check if I get the data correctly.
Any idea?
PS: I lost my account so I had to make this one.
Bitmap master_bitmap;
byte[] decodedString = Base64.decode(values[26], Base64.NO_WRAP);
master_bitmap = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length);
if (master_bitmap != null) {
    try {
        master_frame.setImageBitmap(master_bitmap);
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    }
}
master_bitmap.recycle();
}
Didn't work. Error:
java.lang.RuntimeException: Canvas: trying to use a recycled bitmap
Did you try with try/catch block I explained previously?
Recall:
I had the same problem and I am almost sure this will solve your problem:
String[] safe = values[26].split("=");
byte[] decodedString = Base64.decode(safe[0],Base64.NO_WRAP);
Bitmap master_bitmap = BitmapFactory.decodeByteArray(decodedString,0,decodedString.length);
master_frame.setImageBitmap(master_bitmap);
master_bitmap.recycle(); //THIS LINE WAS ENOUGH TO FIX MY CODE
Also, don't forget to use try/catch blocks effectively.
try {
    // code here
} catch (IllegalArgumentException e) {
    // code here if you want
}
(Add more catch statements if necessary.)
My full code for further help:
byte[] bytearray = Base64.decode(dataIn);
Bitmap myBitmap = BitmapFactory.decodeByteArray(bytearray, 0, bytearray.length);
if (myBitmap != null) {
    // some irrelevant code here to turn the bitmap into a PImage (a Processing image class)
}
myBitmap.recycle();
This works perfectly for me.
Try also this:
try {
    Bitmap master_bitmap;
    master_bitmap = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length);
    if (master_bitmap != null) { // might be an unnecessary if condition
        try {
            if (((BitmapDrawable) master_frame.getDrawable()).getBitmap() != null) {
                ((BitmapDrawable) master_frame.getDrawable()).getBitmap().recycle();
            }
            master_frame.setImageBitmap(master_bitmap);
            // maybe try here: master_bitmap.recycle();
        } catch (RuntimeException e) {
            e.printStackTrace();
        }
    }
} catch (IllegalArgumentException e) {
    e.printStackTrace();
}
I am creating a desktop client-server application in which I capture the frames (as JPG images) rendered by the renderer and store them on the client side.
Now I need to upload the images to the server.
I tried spawning a separate thread for every captured image to upload it directly to the server, but that was very time-consuming. I also tried uploading all the images from the client after capturing stopped, but that is not what I want.
So, is there a way to upload the captured images to the server directly and efficiently?
For capturing images I am using BufferedImage and the ImageIO.write methods.
Thanks in advance
Uploading an image over a socket is fast because the data travels to the server as a raw byte stream.
Below are a simple socket client and socket server that achieve image upload.
Client
public class ImageUploadSocketClient {
    public static void main(String[] args) throws Exception {
        Socket socket = new Socket("localhost", 6666);
        OutputStream outputStream = socket.getOutputStream();
        BufferedImage image = ImageIO.read(new File("path to image /your_image.jpg"));
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        ImageIO.write(image, "jpg", byteArrayOutputStream);
        byte[] size = ByteBuffer.allocate(4).putInt(byteArrayOutputStream.size()).array();
        outputStream.write(size);
        outputStream.write(byteArrayOutputStream.toByteArray());
        outputStream.flush();
        socket.close();
    }
}
Server
public class ImageUploadSocketRunnable implements Runnable {
    public static final String dir = "path to store image";
    Socket soc = null;

    ImageUploadSocketRunnable(Socket soc) {
        this.soc = soc;
    }

    @Override
    public void run() {
        try {
            DataInputStream inputStream = new DataInputStream(this.soc.getInputStream());
            System.out.println("Reading: " + System.currentTimeMillis());
            byte[] sizeAr = new byte[4];
            inputStream.readFully(sizeAr); // read the whole 4-byte length header
            int size = ByteBuffer.wrap(sizeAr).asIntBuffer().get();
            byte[] imageAr = new byte[size];
            inputStream.readFully(imageAr); // read exactly `size` bytes; plain read() may return early
            BufferedImage image = ImageIO.read(new ByteArrayInputStream(imageAr));
            System.out.println("Received " + image.getHeight() + "x" + image.getWidth() + ": " + System.currentTimeMillis());
            ImageIO.write(image, "jpg", new File(dir + System.currentTimeMillis() + ".jpg"));
            inputStream.close();
        } catch (IOException ex) {
            Logger.getLogger(ImageUploadSocketRunnable.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    public static void main(String[] args) throws Exception {
        ServerSocket serverSocket = new ServerSocket(6666); // same port the client connects to
        while (true) {
            Socket socket = serverSocket.accept();
            ImageUploadSocketRunnable imgUploadServer = new ImageUploadSocketRunnable(socket);
            Thread thread = new Thread(imgUploadServer);
            thread.start();
        }
    }
}
On the server you should create a different thread for each client socket; that way you can handle concurrent image uploads from multiple clients.
Hope the above example helps.
I have a program which accepts a connection from a phone that sends a byte array. I am able to test the connection when it is made; however, how do I know that I am actually receiving something? How can I "see" whether anything is being sent over the socket? From my code below, the resulting file "saved.jpg" is never created. Does this mean that it did not receive anything?
public class wpsServer {
    // vars
    private int svrPort = 3334;
    private ServerSocket serverSocket;
    private Image image = null;

    public wpsServer() {
        try {
            serverSocket = new ServerSocket(svrPort);
            System.out.println("Server started on " + svrPort);
        } catch (IOException e) {
            System.out.println("Could not listen on port: " + svrPort);
            System.exit(-1);
        }
    }

    public void listenForClient() {
        Socket clientSocket = null;
        try {
            clientSocket = serverSocket.accept();
            if (clientSocket.isConnected())
                System.out.println("Connected");
            byte[] pic = getPicture(clientSocket.getInputStream());
            InputStream in = new ByteArrayInputStream(pic);
            BufferedImage image = ImageIO.read(in);
            File outputfile = new File("saved.jpg");
            ImageIO.write(image, "jpg", outputfile);
        } catch (IOException e) {
            System.out.println("Accept failed: " + svrPort);
            System.exit(-1);
        }
    }

    public byte[] getPicture(InputStream in) {
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] data = new byte[1024];
            int length = 0;
            while ((length = in.read(data)) != -1) {
                out.write(data, 0, length);
            }
            return out.toByteArray();
        } catch (IOException ioe) {
            // handle it
        }
        return null;
    }
}
The in.read call will only return -1 when the other end closes the socket. While the socket is open, that call blocks until more data is available.
What you need to do is change your "protocol": the client should send the array size first, then the data. The server should read that size, and stop reading the file once that many bytes have arrived (then go back to waiting for the next file, for instance).
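A hedged sketch of that size-first protocol on the receiving side (the 4-byte big-endian header and the class name are assumptions, not taken from the original code):

```java
import java.io.*;

// Illustrative receiver for a length-prefixed image: read the size header,
// then loop until exactly that many bytes have arrived, instead of waiting
// for the sender to close the socket.
public class PictureReceiver {
    public static byte[] readPicture(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        int expected = din.readInt(); // the client sends the array size first
        byte[] pic = new byte[expected];
        int offset = 0;
        while (offset < expected) {
            int n = din.read(pic, offset, expected - offset);
            if (n == -1) {
                throw new EOFException("stream closed before the full image arrived");
            }
            offset += n;
        }
        return pic;
    }
}
```

After readPicture returns, the same stream can immediately be handed back to it for the next file, since no trailing bytes were consumed.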