I need help transferring a PNG image via TCP from my Raspberry Pi (Python) to my Android application (Java). I have spent almost two weeks trying to understand and solve this problem, so any help would be greatly appreciated.
I have set up a client-server architecture in which my Raspberry Pi 3 records audio, performs some analysis on it, and then sends the data (via TCP) to the Android app to display on screen. The recording and analysis are done, and I am able to make the connection and transfer string data that displays on the app with no problem. However, I have been unsuccessful in transferring an image from the Pi to the Android app. So basically, the image is stored on the Pi and I am attempting to transfer it to the app to display it.
Current Implementation:
On the Raspberry Pi (Python): Like I said, sending strings and displaying them on the Android app works without any problem. When I am sending the image portion of the audio analysis, I first send a string that says "?start" so that the Android side knows an image, rather than a string, is about to be sent (and will wait to update the GUI until it receives the entire image). Then, I open the image stored on the Pi and read the entire file as a byte array (typically about 40-50k bytes). I get the length of the byte array and send that as a string to the Android app. Finally, I send the byte array itself, and the Pi waits for an OK message from the app. All of this runs without reporting any errors.
On the Android app (Java): When the app receives the "?start" string, it uses a BufferedReader (the same reader I used for the string data I had already transferred successfully) to read the size of the image byte array. Then I create a buffer, msg_buff, to read in at most 1024 bytes at a time, while baos accumulates the entire byte array of the image. In the infinite while loop, a DataInputStream called in reads bytes into msg_buff and returns the number of bytes read, and I append the contents of msg_buff to baos. Once the number of bytes read from in is -1, or img_offset (the running total of bytes read) is greater than or equal to the size of the image byte array, the while loop breaks. After that, I attempt to save the image to Android internal storage and later load it into an ImageView to display it.
This code successfully reads bytes until there are around 2000-3000 bytes left, and then it seems to freeze on the int bytes_read = in.read(msg_buff, 0, byte_size) line. I have not been able to get past that point, so I do not know whether saving the image to internal storage and then loading it into an ImageView will work either. I believe it is freezing on this line because some bytes are being lost or never sent from Python to Java. Does anyone know how I can resolve this?
The code that reads the image data from the Python server is in the run() method.
TCPClient.java
import android.content.Context;
import android.content.ContextWrapper;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.Log;
import java.io.*;
import java.net.InetAddress;
import java.net.Socket;
public class TcpClient {
public static final String SERVER_IP = myIPAddress; //the Raspberry Pi's IP address
public static final int SERVER_PORT = myPortNumber;
// message to send to the server
private String mServerMessage;
// sends message received notifications
private OnMessageReceived mMessageListener = null;
// while this is true, the server will continue running
private boolean mRun = false;
// used to send messages
private PrintWriter mBufferOut;
// used to read messages from the server
private BufferedReader mBufferIn;
/**
* Constructor of the class. OnMessageReceived listens for the messages received from the server
*/
public TcpClient(OnMessageReceived listener) {
mMessageListener = listener;
}
/**
* Sends the message entered by client to the server
*
* @param message text entered by client
*/
public void sendMessage(String message) {
if (mBufferOut != null && !mBufferOut.checkError()) {
mBufferOut.println(message);
mBufferOut.flush();
}
}
/**
* Close the connection and release the members
*/
public void stopClient() {
Log.i("Debug", "stopClient");
mRun = false;
if (mBufferOut != null) {
mBufferOut.flush();
mBufferOut.close();
}
mMessageListener = null;
mBufferIn = null;
mBufferOut = null;
mServerMessage = null;
}
public void run() {
mRun = true;
try {
//here you must put the Raspberry Pi's IP address.
InetAddress serverAddr = InetAddress.getByName(SERVER_IP);
Log.e("TCP Client", "C: Connecting...");
//create a socket to make the connection with the server
Socket socket = new Socket(serverAddr, SERVER_PORT);
try {
InputStream sin = socket.getInputStream();
OutputStream sout = socket.getOutputStream();
DataInputStream in = new DataInputStream(sin);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
mBufferIn = new BufferedReader(new InputStreamReader(socket.getInputStream()));
//in this while the client listens for the messages sent by the server
while (mRun) {
mServerMessage = mBufferIn.readLine();
if (mServerMessage != null && mMessageListener != null) {
//Check if data is image
if(mServerMessage.equals("?start"))
{
mServerMessage = mBufferIn.readLine();
String fileName = "";
if(mServerMessage.equals("signal"))
{
fileName = "signal.jpeg";
}
else if(mServerMessage.equals("spec"))
{
fileName = "spec.jpeg";
}
// Get length of image byte array
int size = Integer.parseInt(mBufferIn.readLine());
Log.i("Debug:", "image message size: "+size);
// Create buffers
byte[] msg_buff = new byte[1024];
//byte[] img_buff = new byte[size];
int img_offset = 0;
while(true){
int byte_size = msg_buff.length;
int bytes_read = in.read(msg_buff, 0, byte_size);
Log.i("Debug:", "image message bytes:" + bytes_read);
if(bytes_read == -1){
break;
}
//copy bytes into img_buff
//System.arraycopy(msg_buff, 0, img_buff, img_offset, bytes_read);
baos.write(msg_buff, 0, bytes_read);
img_offset += bytes_read;
Log.i("Debug:", "image message bytes read:"+img_offset);
if( img_offset >= size)
{
break;
}
}
try{
byte[] data = baos.toByteArray();
ByteArrayInputStream bais = new ByteArrayInputStream(data);
ContextWrapper cw = new ContextWrapper(ApplicationContextProvider.getContext());
File directory = cw.getDir("imageDir", Context.MODE_PRIVATE);
File mypath = new File(directory, fileName);
//Bitmap bitmap = BitmapFactory.decodeByteArray(img_buff, 0, img_buff.length);
Bitmap bitmap = BitmapFactory.decodeStream(bais);
FileOutputStream fos = new FileOutputStream(mypath);
//Use compress method on Bitmap object to write image to OutputStream
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fos);
fos.flush();
fos.close();
//Send OK
byte[] OK = new byte[] {0x4F, 0x4B};
sout.write(OK);
} catch (Exception e) {
Log.i("Debug:", "image message" +e);
e.printStackTrace();
}
}
//call the method messageReceived from MyActivity class
mMessageListener.messageReceived(mServerMessage);
}
}
Log.e("RESPONSE FROM SERVER", "S: Received Message: '" + mServerMessage + "'");
} catch (Exception e) {
Log.e("TCP", "S: Error", e);
} finally {
//the socket must be closed. It is not possible to reconnect to this socket
// after it is closed, which means a new socket instance has to be created.
socket.close();
}
} catch (Exception e) {
Log.e("TCP", "C: Error", e);
}
}
//Declare the interface. The method messageReceived(String message) must be implemented in the MainActivity
//class, in the AsyncTask's doInBackground
public interface OnMessageReceived {
void messageReceived(String message);
}
}
MainActivity.java:
import android.app.Application;
import android.content.Context;
import android.content.ContextWrapper;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.drawable.Drawable;
import android.os.AsyncTask;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.*;
import org.apache.commons.codec.binary.Base64;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
public class MainActivity extends AppCompatActivity {
private TcpClient mTcpClient;
private TextView dbView;
private TextView roomView;
private TextView classView;
private TextView statusView;
private TextView timeView;
private ImageView signalView;
private ImageView specView;
private Button getAnalysis;
private Button disconnect;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
getAnalysis = findViewById(R.id.get_analysis);
dbView = findViewById(R.id.db_level);
roomView = findViewById(R.id.RoomsValues);
classView = findViewById(R.id.ClassValues);
timeView = findViewById(R.id.timeStamp);
signalView = findViewById(R.id.audioPic);
specView = findViewById(R.id.specPic);
statusView = findViewById(R.id.status);
disconnect = findViewById(R.id.disconnect);
getAnalysis.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v)
{
statusView.setText("Connecting to Auris...\nRoom analytics will arrive shortly.");
new ConnectTask().execute("");
}
});
disconnect.setOnClickListener(new View.OnClickListener(){
@Override
public void onClick(View v)
{
mTcpClient.stopClient();
statusView.setText("Disconnected from Auris.\nReconnect to receive room analysis updates.");
}
});
}
public class ConnectTask extends AsyncTask<String, String, TcpClient> {
@Override
protected TcpClient doInBackground(String... message) {
//we create a TcpClient object and set up its message listener
mTcpClient = new TcpClient(new TcpClient.OnMessageReceived() {
@Override
//here the messageReceived method is implemented
public void messageReceived(String message) {
//this method calls the onProgressUpdate
publishProgress(message);
Log.i("Debug","Input message: " + message);
}
});
//statusView.setText("Get analysis from Auris as it is collected.");
mTcpClient.run();
return null;
}
@Override
protected void onProgressUpdate(String... values) {
super.onProgressUpdate(values);
//Store string of values sent from Auris device
String str = values[0];
//if data starts with +, then it is the string data
if(str.startsWith("+"))
{
//Split values around spaces
/*
Values in data indices
0-8 are room log likelihoods
9-12 are class log likelihoods
13 is dbA level
14 is room model best matched
15 is class model best matched
*/
// Remove +
str = str.substring(1);
String data[]= str.split(" ");
String roomData = "";
String classData = "";
String status;
for(int i = 0; i < 9; i++)
{
roomData = roomData.concat(data[i]);
roomData = roomData.concat("\n");
}
roomView.setText(roomData);
for(int i = 9; i < 13; i++)
{
classData = classData.concat(data[i]);
classData = classData.concat("\n");
}
classView.setText(classData);
dbView.setText(data[13]);
status = "The room most closely matches " + data[14] + " room model & " + data[15] + " class model.";
statusView.setText(status);
}
else if (str.startsWith("TIME"))
{
// Remove "TIME"
str = str.substring(4);
String message = "This room profile represents the room at " + str + ".";
timeView.setText(message);
}
else
{
try {
String fileName = "";
if(str.equals("signal"))
{
fileName = "signal.jpeg";
}
else if(str.equals("spec"))
{
fileName = "spec.jpeg";
}
ContextWrapper cw = new ContextWrapper(ApplicationContextProvider.getContext());
File directory = cw.getDir("imageDir", Context.MODE_PRIVATE);
File file = new File(directory, fileName);
Bitmap bitmap = BitmapFactory.decodeStream(new FileInputStream(file));
signalView.setImageBitmap(bitmap);
} catch (FileNotFoundException e){
e.printStackTrace();
}
}
Log.i("onProgressUpdate",values[0]);
}
}
}
Python Code to send Image Data:
def send_image_to_byte_array(image_file, conn, label):
    with open(image_file, "rb") as imageFile:
        content = imageFile.read()
        conn.sendall("?start\n".encode('utf-8'))
        conn.sendall(label.encode('utf-8'))
        size = len(content)
        strSize = str(size) + "\n"
        conn.sendall(strSize.encode('utf-8'))
        conn.sendall(content)
From what I can tell, not all of the bytes of the image are successfully being sent from the Pi to the Android app. There is data loss, which causes the Android app to hang on the int bytes_read = in.read(msg_buff, 0, byte_size); line in the run() method of TCPClient.java. From reading different posts, it seems that using struct.pack/unpack fixes this problem when transferring an image from Python to Python, but I do not know how to implement the equivalent of struct.unpack in Java, or whether I can just use some input stream. I am also not sure what the best approach to using struct.pack in Python would be. Any help is greatly appreciated!
EDIT:
I believe the problem is the endianness. From what I have read, the Raspberry Pi is little-endian and Java is big-endian. So, when I read the image that was saved on the Raspberry Pi and try to transmit it from Python to Java, these issues occur. Does anyone know how I can change the endianness in Java from big to little, or some other way to fix this problem?
The issue is caused by the BufferedReader reading in extra data off the line (in order to fill its internal buffer), which makes that data unavailable from in.read().
As you can see from a sample Android BufferedReader implementation, a call to readLine() causes the BufferedReader to attempt to fill its internal buffer. It will do this using any available bytes on its source InputStream, up to a limit of 8192 chars. And if the BufferedReader has read those bytes, they won't be there when you try to get them from in.read(). This throws off your entire size-counting system, meaning you eventually end up blocking in in.read(), because you didn't read all the data you expected to.
The most expedient solution is probably to implement your own version of readLine() that assembles a string one byte at a time until it hits a '\n'. After all, the only reason you needed a BufferedReader was for the readLine() function.
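For example, a minimal byte-at-a-time line reader could look something like this (a sketch only; the helper name readLineFromStream and the UTF-8 charset are my assumptions):
// Reads one '\n'-terminated line directly from the DataInputStream,
// so no extra bytes get swallowed by a separate buffered wrapper.
private static String readLineFromStream(DataInputStream in) throws IOException {
    ByteArrayOutputStream line = new ByteArrayOutputStream();
    int b;
    while ((b = in.read()) != -1) {
        if (b == '\n') {
            break;
        }
        line.write(b);
    }
    return new String(line.toByteArray(), "UTF-8").trim();
}
With something like that, every read (the "?start" marker, the label, the size, and the image bytes themselves) goes through the single DataInputStream, so the byte counting stays consistent.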
Related
I am sending a screenshot from a Java server hosted on a machine. The screenshot is sent to an Android device through sockets in the form of a byte array. But the array I am getting on the Android device is not getting converted to a bitmap. Below I am attaching the Java server code, the Android code, and the logcat.
Here is the Java server code that sends the captured screenshot.
socket2 = serverSocket2.accept();
System.out.println("A client has connected");
Robot robot = new Robot();
String format = "jpg";
String fileName = "FullScreenshot." + format;
Rectangle screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
BufferedImage screenFullImage = robot.createScreenCapture(screenRect); // captured image
ImageIO.write(screenFullImage, format, new File("C:\\Users\\User\\Desktop\\"+fileName));
ByteArrayOutputStream bao=new ByteArrayOutputStream();
ImageIO.write(screenFullImage, format, bao);
byte[] ar=bao.toByteArray();
System.out.println("Parental block is executed");
mainWriter = new BufferedWriter(new OutputStreamWriter(socket2.getOutputStream()));
mainWriter.write(java.util.Arrays.toString(ar));
mainWriter.newLine();
mainWriter.flush();
System.out.println("A full screenshot saved!");
serverSocket2.close();
socket2.close();
mainWriter.close();
Testing t = new Testing();
Here is the Android code where I receive the byte array.
public void PCConnection(final View view) // just for this activity
{
new Thread()
{
public Socket socket;
public void run()
{
try
{
Log.i(DebuggString,"Attempting to connect to the server");
socket = new Socket(hostname,60120);
Log.i(DebuggString,"Connection established");
mivScreenShot = (ImageView) findViewById(R.id.ivScreenShot);
//Receive message from the server
//Message is stored in the br.readLine()
brr = new BufferedReader(new InputStreamReader(socket.getInputStream()));
img = brr.readLine();
Log.d("Image", img);//yeh byte[] display karta h
final ByteArrayInputStream arrayInputStream = new ByteArrayInputStream(img.getBytes());
arrayInputStream.reset();
this.socket.close();
runOnUiThread(new Runnable() {
@Override
public void run() {
Glide.with(ParentalControl.this)
.load(bitmap)
.asBitmap()
.into(mivScreenShot);
}
});
if(bitmap!=null)
{
Log.d(DebuggString,"Bitmap is not null "); // oh ok koi nai
}
else
{
Log.d(DebuggString,"Bitmap is null");
}
}
catch (IOException e )
{
e.printStackTrace();
}
}
}.start();
}
Here I am attaching the logcat screenshot url: https://i.imgur.com/167Vje3.png
mainWriter = new BufferedWriter(new OutputStreamWriter(socket2.getOutputStream()));
mainWriter.write(java.util.Arrays.toString(ar));
You cannot use writers and strings to send a JPG image; those are for text only. Do away with them. Do away with the ByteArrayOutputStream too, since you can write the image directly to the socket's output stream.
Further, check how many bytes you send and how many are received.
That was for the server. On the receiving side, you likewise cannot use readers and strings.
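A rough sketch of what that can look like with a length prefix instead of strings (my sketch, reusing the variable names from the question; untested against your exact setup):
// Server side: send a 4-byte length, then the raw JPEG bytes.
DataOutputStream imgOut = new DataOutputStream(socket2.getOutputStream());
imgOut.writeInt(ar.length);   // ar is the byte[] produced by ImageIO.write(...)
imgOut.write(ar);
imgOut.flush();

// Android side: read the length, then exactly that many bytes.
DataInputStream imgIn = new DataInputStream(socket.getInputStream());
int len = imgIn.readInt();
byte[] data = new byte[len];
imgIn.readFully(data);        // blocks until all len bytes have arrived
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, len);
This also makes the "how many bytes sent versus received" check trivial, since both sides know len.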
I have coded an app to connect to a network socket on a Raspberry Pi over Wi-Fi and have successfully sent data, such as a string, between the client and server. The next step is to have the client, my phone in this case, receive and play audio data in real time as it is being sent from the server.
So far I have socket and audio variables declared. Once a button is hit, an async task "Listen" is started. I know it can connect to the socket. After that, I'm trying to open an input stream and feed it into a buffer, and then have that buffer played in real time.
I think I'm close to figuring it out, but I don't think I have it quite right yet. I have marked areas I am unsure of with TODO in my code.
public class MainActivity extends AppCompatActivity {
//audio variables
static final int sampleFreq = 44100;
static final int channelConfig = AudioFormat.CHANNEL_OUT_MONO;
static final int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
static final int streamType = STREAM_MUSIC;
static final int audioMode = MODE_STREAM;
static final int bufferSize = getMinBufferSize(sampleFreq, channelConfig, audioMode);
//socket variables
public Socket mySocket;
private int SERVERPORT = 33333;
private static final String SERVER_IP = "172.24.1.1";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
SharedPreferences sharedPref = PreferenceManager.getDefaultSharedPreferences(this);
}
public void onClickListen(View view) {
//call async task
new Listen().execute();
}
private class Listen extends AsyncTask<Void, Void, Void> {
@Override
protected Void doInBackground(Void... params) {
try {
//get IP and port number
InetAddress serverAddr = InetAddress.getByName(SERVER_IP);
Log.d(debugStr, "In initial listen connect");
//create socket
mySocket = new Socket(serverAddr, SERVERPORT);
} catch (UnknownHostException e1) {
e1.printStackTrace();
} catch (IOException e1) {
e1.printStackTrace();
} finally {
if (mySocket == null) {
str1 = "Socket became null, listener";
return null;
}
//TODO here is all the new audio stuff
try {
//creates buffer to hold incoming audio data
byte [] audioBuffer = new byte[4096];
//creates input stream readers to read incoming data
BufferedInputStream myBis = new BufferedInputStream(mySocket.getInputStream());
DataInputStream myDis = new DataInputStream(myBis);
Log.d(debugStr, "Input created, listener");
// Read the file into the music array.
int i = 0;
//TODO unsure of while loop condition
while (myDis.read() != -1) {
//since incoming audio is unknown size, use estimated buffer size
audioBuffer[bufferSize-1-i] = myDis.readByte();
i++;
}
//create audio track object
AudioTrack myAudioTrack = new AudioTrack(streamType, sampleFreq, channelConfig, audioEncoding, bufferSize, audioMode);
//TODO write audio data to somewhere?
//TODO should this be in a while loop for live stream?
//TODO will this actually play the audio?
myAudioTrack.write(audioBuffer, 0, bufferSize);
//close input streams
myDis.close();
myBis.close();
} catch (IOException e1) {
e1.printStackTrace();
}
try {
mySocket.close();
} catch (IOException e1) {
e1.printStackTrace();
}
return null;
}
}
}
}
I don't think my while loop condition is correct, and I'm not sure that is even the right way to read data from an input stream. I also think the audioTrack.write should be in the while loop, if that is actually what makes the audio play.
Any help or clarification is greatly appreciated!
I figured it out.
byte [] audioBuffer = new byte[4096];
//creates input stream readers to read incoming data
BufferedInputStream myBis = new BufferedInputStream(mySocket.getInputStream());
DataInputStream myDis = new DataInputStream(myBis);
Log.d(debugStr, "Input created, listener");
AudioTrack myAudioTrack = new AudioTrack(streamType, sampleFreq, channelConfig, audioEncoding, bufferSize, audioMode);
//Log.d(debugStr, String.valueOf(mySocket.getInputStream().read(audioBuffer)));
Log.d(debugStr, "track made");
// Read the incoming audio into the buffer and play it as it arrives.
myAudioTrack.play();
int bytesRead;
while ((bytesRead = myDis.read(audioBuffer)) != -1) {
myAudioTrack.write(audioBuffer, 0, bytesRead);
}
//close input streams
myDis.close();
myBis.close();
Basically, I've been working on a wireless mouse that uses Bluetooth or Wi-Fi. I've gotten everything working, including reading and writing messages. However, the rate at which data transfers over Bluetooth is too slow to keep up. I've looked all over the place and I can't figure out what is causing this slowness. I am using a dedicated thread for all the write operations.
Here is my ConnectedThread code (almost exactly like in the Android SDK example)
package com.tutorials.jurko.androidmouse;
import android.bluetooth.BluetoothSocket;
import android.net.ConnectivityManager;
import java.io.BufferedOutputStream;
import java.io.IOError;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
/**
* Created by Jurko on 14/02/2015.
*/
public class ConnectedThread extends Thread {
private final BluetoothSocket mmSocket;
private final InputStream mmInStream;
private final BufferedOutputStream mmOutStream;
public static int count = 0;
public ConnectedThread(BluetoothSocket socket) {
mmSocket = socket;
InputStream tmpIn = null;
OutputStream tmpOut = null;
try {
tmpIn = mmSocket.getInputStream();
tmpOut = mmSocket.getOutputStream();
} catch (IOException e) {
e.printStackTrace();
}
mmInStream = tmpIn;
mmOutStream = new BufferedOutputStream(tmpOut);
}
public void write(Byte[] bytes) {
count++;
try {
byte x = bytes[0].byteValue();
byte y = bytes[1].byteValue();
System.out.println("Count: " + count);
byte buf[] = {x, y};
mmOutStream.write(buf);
mmOutStream.flush();
} catch (IOException e) {
e.printStackTrace();
}
}
}
Here is my server code (Receiving messages)
import javax.bluetooth.LocalDevice;
import javax.bluetooth.RemoteDevice;
import javax.bluetooth.UUID;
import javax.microedition.io.Connector;
import javax.microedition.io.StreamConnection;
import javax.microedition.io.StreamConnectionNotifier;
import java.awt.*;
import java.awt.event.InputEvent;
import java.io.*;
/**
* Class that implements an SPP Server which accepts single line of
* message from an SPP client and sends a single line of response to the client.
*/
public class SimpleSPPServer {
//start server
private void startServer() throws IOException, AWTException {
Robot r = new Robot();
//Create a UUID for SPP
UUID uuid = new UUID("1101", true);
//Create the service url
String connectionString = "btspp://localhost:" + uuid +";name=Sample SPP Server";
//open server url
StreamConnectionNotifier streamConnNotifier = (StreamConnectionNotifier)Connector.open( connectionString );
//Wait for client connection
System.out.println("\nServer Started. Waiting for clients to connect...");
StreamConnection connection=streamConnNotifier.acceptAndOpen();
RemoteDevice dev = RemoteDevice.getRemoteDevice(connection);
System.out.println("Remote device address: "+dev.getBluetoothAddress());
System.out.println("Remote device name: "+dev.getFriendlyName(true));
//read string from spp client
InputStream inStream=connection.openInputStream();
BufferedReader reader = new BufferedReader(new InputStreamReader(inStream));
byte[] lineRead = new byte[2];
while(inStream.read(lineRead) != -1) {
System.out.println(lineRead[0] + " " + lineRead[1]);
// Code to control mouse here
}
//send response to spp client
OutputStream outStream=connection.openOutputStream();
PrintWriter pWriter=new PrintWriter(new OutputStreamWriter(outStream));
pWriter.write("Response String from SPP Server\r\n");
pWriter.flush();
pWriter.close();
streamConnNotifier.close();
}
public static void main(String[] args) throws IOException, AWTException {
//display local device address and name
LocalDevice localDevice = LocalDevice.getLocalDevice();
System.out.println("Address: "+localDevice.getBluetoothAddress());
System.out.println("Name: "+localDevice.getFriendlyName());
SimpleSPPServer sampleSPPServer=new SimpleSPPServer();
sampleSPPServer.startServer();
}
}
Someone asked for an answer, so here it is. Out of complete coincidence (and boredom I guess), I decided to fix ALL of the warnings Android Studio was giving me.
One of the warnings indicated that I was instantiating new objects in the onDraw() function. It turns out that the Bluetooth was not slow; in fact, onDraw() took so long that it delayed the transmission of new messages, making it appear as if a consistent lag was present.
tl;dr: Don't instantiate new objects in your onDraw function.
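To illustrate the kind of change that warning points at, here is a generic sketch (a hypothetical view class, not the actual project code): the Paint is created once as a field instead of on every frame.
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.View;

public class TouchpadView extends View {
    // Allocated once; onDraw() can run many times per second,
    // so allocating here on every call would trigger GC pauses and lag.
    private final Paint cursorPaint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public TouchpadView(Context context) {
        super(context);
        cursorPaint.setColor(Color.RED);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        canvas.drawCircle(100f, 100f, 20f, cursorPaint); // no "new" in here
    }
}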
I have an Android application connected to a Bluetooth Mate Silver chip. I'm in the process of testing its send/receive functions. Mostly I have been following the Bluetooth examples on the Android dev site.
I can tell sending data works because when I write("$$$") to the chip, it enters command mode and flashes its status LED very quickly. When the chip enters command mode, it sends a reply: "CMD". I am having trouble receiving this reply.
When I press a button, the following code is executed. mct is the global ConnectedThread that I am using to read and write. As poor form as it is, all functions are inside MainActivity.java
if(connected){
if (cmdMode == false){
mct.write("$$$".getBytes()); //enter command mode
mct.listen();
TextView lbl_history = (TextView) findViewById(R.id.lbl_history);
lbl_history.setText(message);
cmdMode = true;
}
else{
mct.write("k,\n".getBytes()); //kill the connection
cmdMode = false;
}
}
My communication thread:
private class ConnectedThread extends Thread {
private final BluetoothSocket mmSocket;
private final InputStream mmInStream;
private final OutputStream mmOutStream;
public ConnectedThread(BluetoothSocket socket) {
mmSocket = socket;
InputStream tmpIn = null;
OutputStream tmpOut = null;
try {
tmpIn = socket.getInputStream();
tmpOut = socket.getOutputStream();
} catch (IOException e) { }
mmInStream = tmpIn;
mmOutStream = tmpOut;
}
public void listen() {
handled = false;
byte[] buffer = new byte[1024]; // buffer store for the stream
int bytes; // bytes returned from read()
reply=null;
while (reply==null) {
try {
// Read from the InputStream
bytes = mmInStream.read(buffer);
reply = buffer.toString();
//message is a global String to store the latest message received
message = reply;
} catch (IOException e) {
break;
}
}
reply = null;
}
//write and cancel functions removed for simplicity
}
When I run this code, the result is a TextView that says "[B@415f8910", which I assume is junk. Multiple runs of the same code produce similar results, with the last few digits varying. The expected result would be "CMD". Any ideas on what the problem is here? I am new to Android development, so any help is appreciated.
Further inspection reveals that the value "[B@415f8910" strictly increases across runs, leading me to believe that it is a memory address. Still, I don't know what to do with it.
I found the problem. Rather than straight up calling "toString()" on the array of bytes, I needed to call the String constructor to properly convert the data:
String message = new String(buffer, "UTF-8");
Specifying UTF-8 is what made the difference.
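One caveat worth adding: read() only fills part of the 1024-byte buffer, so it is safer to convert just the bytes actually read, for example:
String message = new String(buffer, 0, bytes, "UTF-8");
Otherwise the string ends up padded with the unused bytes at the end of the buffer.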
I am trying to get the current time and date in an Android application and transmit them via Bluetooth. I have tried using both Time and Calendar to get the hour, minute, second, month, day, and year minus 2000. I then cast each value to a byte and placed them into a byte array. However, when I try to send the values over Bluetooth, they come out wrong on the other side. The format I'm shooting for is a header (0xFF) followed by hour, minute, second, month, day, year.
public class Bluetooth extends Activity{
private static final UUID MY_UUID = UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");
private String bt_address;//00:18:E4:OC:67:FF
private BluetoothAdapter mBluetoothAdapter = null;
private BluetoothSocket btSocket = null;
private OutputStream outStream = null;
private InputStream inStream = null;
@Override
protected void onCreate(Bundle savedInstanceState){
super.onCreate(savedInstanceState);
setContentView(R.layout.bluetooth);
Button connect = (Button)findViewById(R.id.Btn_Connect);
try{
/*code removed, reads bt_address from a file or creates a file if no file exists yet*/
} catch(IOException ioe){
ioe.getStackTrace();
}
connect.setOnClickListener(new OnClickListener(){
public void onClick(View v){
try{
/*code removed, saves bt_address to a file*/
}
mBluetoothAdapter = BluetoothAdapter.getDefaultAdapter();
/*code removed, prompts user to turn on Bluetooth if not already on*/
BluetoothDevice device = mBluetoothAdapter.getRemoteDevice(bt_address);
try {
btSocket = device.createRfcommSocketToServiceRecord(MY_UUID);
} catch (IOException e) {}
mBluetoothAdapter.cancelDiscovery();
try {
btSocket.connect();
} catch (IOException e) {
try {
btSocket.close();
} catch (IOException e2) {}
}
try {
outStream = btSocket.getOutputStream();
} catch (IOException e) {}
try {
inStream = btSocket.getInputStream();
} catch (IOException e) {}
Time time = new Time();
byte[] msgbuf = new byte[7];
msgbuf[0] = (byte)0xFF;
msgbuf[1] = (byte)time.hour;
msgbuf[2] = (byte)time.minute;
msgbuf[3] = (byte)time.second;
msgbuf[4] = (byte)(time.month + 1);
msgbuf[5] = (byte)time.monthDay;
msgbuf[6] = (byte)(time.year - 2000);
outStream.write(msgbuf);
}
});
This code is set up to connect to a device using its Bluetooth address and send it the time stamp when a button is clicked. It will connect to the device and send a long string of numbers in the process, but I'm beginning to think that it disconnects before it sends the time stamp.
You send just one byte at a time. The method outStream.write(int) takes an int parameter (only the low 8 bits are significant), so what you send is correct. Did you flush the stream?
It also matters how you receive this message. You should use the InputStream.read() method and cast the returned int to a byte.
Maybe a better solution is to send the bytes with outStream.write(msgbuf) and read them with inputStream.read(receiveBuffer);
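On the receiving side, that could look roughly like this (a sketch; the stream name inputStream and the loop structure are my assumptions, following the packet layout from the question):
// Read the full 7-byte packet, then decode the fields.
byte[] receiveBuffer = new byte[7];
int read = 0;
while (read < receiveBuffer.length) {
    int n = inputStream.read(receiveBuffer, read, receiveBuffer.length - read);
    if (n == -1) {
        break; // connection closed before the whole packet arrived
    }
    read += n;
}
if (read == 7 && (receiveBuffer[0] & 0xFF) == 0xFF) { // header byte
    int hour   = receiveBuffer[1] & 0xFF;
    int minute = receiveBuffer[2] & 0xFF;
    int second = receiveBuffer[3] & 0xFF;
    int month  = receiveBuffer[4] & 0xFF;
    int day    = receiveBuffer[5] & 0xFF;
    int year   = 2000 + (receiveBuffer[6] & 0xFF);
    // ...use the decoded values...
}
The & 0xFF masking is what handles the int-to-byte conversion mentioned above.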
The Android Time class has built-in functions to convert a Time object into two different standardized string formats: format2445() and format3339(). They are already there for you, implement open standards, and somebody else has done the QA.
The strings are going to be slightly longer than the format you are working with, but at this size I doubt it will matter. Likewise, they will be only slightly more difficult for you to decode on the receiving side. Balancing that, since it's a human-readable string, it will be easier for you to debug.
I suggest you use one of these two string formats.
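For example, the sending side could be reduced to something like this (a sketch reusing outStream from the question; the trailing newline is my own choice to delimit messages):
// Send the current time as an RFC 3339 timestamp string instead of packed bytes.
Time time = new Time();
time.setToNow();
String stamp = time.format3339(false) + "\n";
outStream.write(stamp.getBytes("UTF-8"));
outStream.flush();
The receiving side then only has to read characters up to the newline and parse out the fields it needs.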