I am working on a game, and what I want to do is take a screenshot when the player dies and share that image via the iOS share sheet to Twitter, Facebook, etc. I have the image stored locally.
Here is my code for sharing an image link, but I want to upload a local image instead.
Any help would be much appreciated.
public void Share(int score)
{
    UIViewController rootViewController = new UIViewController();
    NSURL url = new NSURL("http://i.imgur.com/iWKad22.jpg");
    NSData data = NSData.read(url);
    UIImage image = new UIImage(data);
    NSString textShare = new NSString("My Score: " + score + "! #Udderpanic");
    NSArray<NSObject> textToShare = new NSArray<NSObject>(textShare, image);
    UIActivityViewController share = new UIActivityViewController(textToShare, null);

    ((IOSApplication) Gdx.app).getUIViewController().addChildViewController(rootViewController);

    if (UIDevice.getCurrentDevice().getModel().contentEquals("iPad"))
    {
        final UIPopoverController popoverController = new UIPopoverController(share);
        popoverController.presentFromRectInView(new CGRect(0, 400, 0, 400), rootViewController.getView(), UIPopoverArrowDirection.Right, true);
    }
    else
    {
        rootViewController.presentViewController(share, true, null);
    }
}
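For reference, here is a minimal sketch of how the share items inside Share(int score) could be built from the locally stored screenshot instead of a downloaded one (the file name is a placeholder; this mirrors the UIImage-from-file approach used in the generic share answer further down):

    UIImage image = new UIImage(Gdx.files.local("death_screenshot.png").file()); // placeholder path to the saved screenshot
    NSString textShare = new NSString("My Score: " + score + "! #Udderpanic");
    NSArray<NSObject> textToShare = new NSArray<NSObject>(textShare, image);
    UIActivityViewController share = new UIActivityViewController(textToShare, null);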
Related
This is the QR code generator. I put a device-storage path into qrCodeData to try to access my phone's storage and open a file, but it doesn't work. It turns out the generated QR code only gives the text of the link.
public class QRCode {
    public static void main(String[] args) {
        try {
            String qrCodeData = "Device storage/Download/japanese/Mastering_Kanji_1500.pdf";
            String filePath = "D:\\QR code project\\Generated QR codes\\qr.png";
            String charset = "UTF-8"; // or "ISO-8859-1"
            Map<EncodeHintType, ErrorCorrectionLevel> hintMap = new HashMap<EncodeHintType, ErrorCorrectionLevel>();
            hintMap.put(EncodeHintType.ERROR_CORRECTION, ErrorCorrectionLevel.L);
            BitMatrix matrix = new MultiFormatWriter().encode(
                    new String(qrCodeData.getBytes(charset), charset),
                    BarcodeFormat.QR_CODE, 200, 200, hintMap);
            // The image format is taken from the file extension ("png")
            MatrixToImageWriter.writeToFile(matrix,
                    filePath.substring(filePath.lastIndexOf('.') + 1), new File(filePath));
            System.out.println("QR Code image created successfully and stored at " + filePath);
        } catch (Exception e) {
            System.err.println(e);
        }
    }
}
You can view and manipulate PDF files with the PDFBox library, which also has a community Android port. MuPDF is another option, and it too has an Android version.
Interpret the received link as a file path, or download it to storage, then hand the file to PDFBox.
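As an example, a minimal sketch of opening a downloaded PDF with PDFBox 2.x once the file is on storage (the path below is a placeholder built from the question's file name):

    import java.io.File;
    import java.io.IOException;
    import org.apache.pdfbox.pdmodel.PDDocument;
    import org.apache.pdfbox.text.PDFTextStripper;

    public class PdfOpenExample {
        public static void main(String[] args) throws IOException {
            // Placeholder path; replace with the file resolved from the QR code link
            File pdfFile = new File("/storage/emulated/0/Download/japanese/Mastering_Kanji_1500.pdf");
            try (PDDocument document = PDDocument.load(pdfFile)) {
                System.out.println("Pages: " + document.getNumberOfPages());
                // Extract a little text to confirm the document opened correctly
                String text = new PDFTextStripper().getText(document);
                System.out.println(text.substring(0, Math.min(200, text.length())));
            }
        }
    }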
Note that on recent Android versions, downloaded files fall under scoped storage, so Google recommends accessing them through the Storage Access Framework or the MediaStore APIs rather than raw file paths.
Hope this helps.
I have an API to download a file. It downloads the file, but the browser only shows it after the download completes; there is no download progress.
I want Chrome to show the download progress while the request is being served, but currently it only appears after completion.
I am using Spring Boot.
public ResponseEntity<Resource> getFile(String fileName) throws IOException {
    File file = new File(fileName);
    byte[] data;
    try (InputStream inputStream = new FileInputStream(file)) {
        data = IOUtils.toByteArray(inputStream);
    }
    ByteArrayResource fileToDownload = new ByteArrayResource(data);
    return ResponseEntity.ok()
            .contentType(MediaType.parseMediaType("application/octet-stream"))
            .header("Content-Disposition", "attachment; filename=\"" + fileName + "\"")
            .body(fileToDownload);
}
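For reference, a minimal sketch of the same endpoint streamed with an explicit Content-Length (names reused from the snippet above); the length gives the browser a total to measure received bytes against, and InputStreamResource avoids buffering the whole file in memory:

    public ResponseEntity<Resource> getFile(String fileName) throws IOException {
        File file = new File(fileName);
        // Stream the file instead of copying it into a byte[] first
        InputStreamResource body = new InputStreamResource(new FileInputStream(file));
        return ResponseEntity.ok()
                .contentType(MediaType.APPLICATION_OCTET_STREAM)
                .contentLength(file.length())
                .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + file.getName() + "\"")
                .body(body);
    }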
Use JavaScript in your web page. The progress display is not about how the server sends the file; it has to be rendered client-side. The server could report how far along it is in sending the file, but that is not what you want to show: you are interested in how much the client has received, displayed in the client. Any answer will therefore rely on JS and HTML to some extent, so why not solve it entirely on the client side?
In this answer they use the following code:
function saveOrOpenBlob(url, blobName) {
    var blob;
    var xmlHTTP = new XMLHttpRequest();
    xmlHTTP.open('GET', url, true);
    xmlHTTP.responseType = 'arraybuffer';
    xmlHTTP.onload = function(e) {
        blob = new Blob([this.response]);
    };
    xmlHTTP.onprogress = function(pr) {
        //pr.loaded - current state
        //pr.total - max
    };
    xmlHTTP.onloadend = function(e) {
        var fileName = blobName;
        var tempEl = document.createElement("a");
        document.body.appendChild(tempEl);
        tempEl.style = "display: none";
        url = window.URL.createObjectURL(blob);
        tempEl.href = url;
        tempEl.download = fileName;
        tempEl.click();
        window.URL.revokeObjectURL(url);
    };
    xmlHTTP.send();
}
Note that you are missing the part where you actually display the progress somewhere. For example, you could implement it as follows:
xmlHTTP.onprogress = function(pr) {
    //pr.loaded - current state
    //pr.total - max
    let percentage = (pr.loaded / pr.total) * 100;
    document.getElementById("progress").textContent = percentage + "% complete";
};
This assumes that there is something like
<span id="progress">downloading...</span>
in your HTML.
My Java program downloads static map images from Google Maps that show the route line. If I go to this link in my browser, I get the correct image: a map with a directional polyline.
But when I download the image from the same URL with my Java program, I get this instead:
Both URLs look the same to me, and I can't work out what's wrong. Here is my code, in case anyone can spot something out of place.
Code:
try {
    String mapImgUrl = "https://maps.googleapis.com/maps/api/staticmap?size=300x300&path=enc:" + polyline + "&key=AIzaSyBn2qYJcHoNCgNQZv1mcycnUo06sJDZPBs";
    String imageFileName = houseNumber + " " + address + ".jpg";

    URL url = new URL(mapImgUrl);
    InputStream is = url.openStream();
    OutputStream os = new FileOutputStream(imageFileName);

    byte[] b = new byte[2048];
    int length;
    while ((length = is.read(b)) != -1) {
        os.write(b, 0, length);
    }
    is.close();
    os.close();

    ImageIcon imgIcon = new ImageIcon((new ImageIcon(imageFileName))
            .getImage().getScaledInstance(400, 400, java.awt.Image.SCALE_SMOOTH));

    SwingUtilities.invokeLater(new Runnable() {
        public void run() {
            JLabel labelMap = new JLabel();
            labelMap.setIcon(imgIcon);
            panelMap.add(labelMap);
        }
    });
The polyline data is correct: I have compared the data I get from the JSON in my browser with the data I get from my program, and they match. This is the polyline data straight from the API, in case it helps:
c{utHdfqJJaA`AoI\\oATo#Xe#P[R_#NYFYHs#AGAQDe#LYHGNCF?h#[Za#bBuDtA_Dp#gAR[J[#i#?SF]FIJEJAj#m#f#iAbByDlQoa#Pm#By#CS?YBSFOHKLCXOr#SbE}G~#gBfAiBdH_MjAwBFa#j#_ARYQg#kAkDIBIEEI?M#GMMQYgBiEaD{HJOH[`Hy`#d#iC{CgBgEcC}CeBb#gC
I think the issue is that you haven't URL-encoded the polyline.
Use java.net.URLEncoder:
String mapImgUrl = "https://maps.googleapis.com/maps/api/staticmap?size=300x300&path=enc:"
+ URLEncoder.encode(polyline, "utf-8") + "&key=<key>";
Note that the encode(String) overload, which doesn't take a character encoding, is deprecated. Also, make sure you import URLEncoder from java.net rather than from another package.
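If you're on Java 10 or newer, the Charset overload avoids the checked UnsupportedEncodingException (it needs an import of java.nio.charset.StandardCharsets):

    String mapImgUrl = "https://maps.googleapis.com/maps/api/staticmap?size=300x300&path=enc:"
            + URLEncoder.encode(polyline, StandardCharsets.UTF_8) + "&key=<key>";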
I found the problem. The polyline portion of the URL was null because the thread that fetches the polyline data had not finished before the thread that builds the URL started. I solved this by joining the threads.
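A minimal sketch of that fix (the thread and fetch-method names are hypothetical, and polyline is assumed to be a field written by the fetching thread):

    Thread polylineThread = new Thread(() -> polyline = fetchPolyline()); // hypothetical fetch of the Directions JSON
    polylineThread.start();
    try {
        polylineThread.join(); // wait for the polyline before building the static-map URL
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
    // polyline is now populated and safe to use in mapImgUrl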
So, for Android devices there is a default share-intent function you can call that lists all the apps installed on the device that allow sharing content. How would I do this using RoboVM? Here is a screenshot of what I am trying to achieve. Also, on a side note, what I ultimately want to do is take a screenshot of the device and post it to whatever social media site the user chooses. Any help would be greatly appreciated :D
For iOS users:
The original RoboVM project has been discontinued; use the RoboPods bindings instead:
https://github.com/robovm/robovm-robopods/tree/master
For sharing text using RoboVM, you can use the UIActivityViewController class:
NSString textShare = new NSString("This is the text to share");
NSArray texttoshare = new NSArray(textShare);
UIActivityViewController share = new UIActivityViewController(texttoshare,null);
presentViewController(share, true, null);
For sharing text, an image, and app links, you can use the following code:
NSURL imgUrl = new NSURL(EXTERNAL_IMG_URL);
NSData data = NSData.read(imgUrl);
UIImage image = new UIImage(data);
NSString appStoreUrl = new NSString(APP_STORE_URL);
NSString googleUrl = new NSString(GOOGLE_PLAY_URL);
NSString text = new NSString(TEXT_TO_SHARE);

NSArray<NSObject> texttoshare = new NSArray<NSObject>(text, image, appStoreUrl, googleUrl);
UIActivityViewController share = new UIActivityViewController(texttoshare, null);

if (UIDevice.getCurrentDevice().getUserInterfaceIdiom() == UIUserInterfaceIdiom.Phone) {
    // iPhone
    iosApplication.getUIViewController().presentViewController(share, true, null);
} else {
    // iPad
    UIPopoverController popover = new UIPopoverController(share);
    UIView view = iosApplication.getUIViewController().getView();
    CGRect rect = new CGRect(
            view.getFrame().getWidth() / 2,
            view.getFrame().getHeight() / 4,
            0,
            0);
    popover.presentFromRectInView(rect, view, UIPopoverArrowDirection.Any, true);
}
The following code sends an animated .gif through Mail and Messenger; however, Twitter only shares a static image.
public void shareGif(String path)
{
    NSURL url = new NSURL(new File(path));
    NSArray<NSObject> shareItems = new NSArray<NSObject>(url); // share the file URL (the original snippet referenced an undefined "image" variable)
    UIActivityViewController share = new UIActivityViewController(shareItems, null);
    ((IOSApplication) Gdx.app).getUIViewController().presentViewController(share, true, null);
}
For a generic share on iOS, with text, a link, and an image, the code is given below:
@Override
public void share() {
NSArray<NSObject> items = new NSArray<>(
new NSString("Hey! Check out this mad search engine I use"),
new NSURL("https://www.google.com"),
new UIImage(Gdx.files.external("image.png").file())
);
UIActivityViewController uiActivityViewController = new UIActivityViewController(items, null);
((IOSApplication) Gdx.app).getUIViewController().presentViewController(uiActivityViewController, true, null);
}
For Android users: Sharing Content with intents
It says RoboVM is for iOS only, so how would we do the same for Android using libGDX?
RoboVM with RoboPods is used for iOS, while Google Play Services is used for Android. Different devices, different code. For Android, after integrating the Android SDK into your IDE, just link the Google Play Services library to the Android project.
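For comparison, a minimal sketch of the Android-side share intent mentioned above (the MIME type, text, and screenshotUri are placeholders; screenshotUri would be a content:// Uri for the saved screenshot):

    Intent shareIntent = new Intent(Intent.ACTION_SEND);
    shareIntent.setType("image/png");
    shareIntent.putExtra(Intent.EXTRA_TEXT, "My Score: 42! #Udderpanic");
    shareIntent.putExtra(Intent.EXTRA_STREAM, screenshotUri); // placeholder content:// Uri
    startActivity(Intent.createChooser(shareIntent, "Share via"));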
For my Android game, based on AndEngine, I am currently working on loading levels.
I came across the AndEngine-PhysicsEditor-Extension, which loads XML and graphics exported by PhysicsEditor.
I am now working on creating a class, conveniently named MapLoader, which takes a map name and calls the PhysicsEditor extension to load the appropriate files and register them. The problem is that while the graphics are displayed as intended, everything else in the scene passes right through them without colliding.
Here's how the MapLoader is invoked:
MapLoader loader = new MapLoader(this, engine, world); // (Context, PhysicsWorld, Scene)
loader.loadMap("testmap1");
And here's the important bits of the MapLoader class:
// loadMap method
public void loadMap(String mapName) {
    this.atlas = new BitmapTextureAtlas(game.getTextureManager(), 2048, 1024, TextureOptions.BILINEAR);
    atlas.load();

    PhysicsEditorLoader loader = new PhysicsEditorLoader();

    this.tMap = BitmapTextureAtlasTextureRegionFactory.createFromAsset(this.atlas, game, "maps/" + mapName + ".png", 0, 0);
    this.sMap = new Sprite(0, 500, this.tMap, game.getVertexBufferObjectManager());
    this.world.attachChild(sMap);

    try {
        loader.load(game, engine, "maps/" + mapName + ".xml", sMap, true, true);
    } catch (IOException e) {
        e.printStackTrace();
        System.out.println("XML load failure");
    }

    this.sMap = new Sprite(0, 0, this.tMap, game.getVertexBufferObjectManager());
    world.attachChild(sMap);
    world.registerUpdateHandler(engine);
}
Did I miss something? I based the above code on the example.
I've tried editing the source to point to a non-existent directory for the XML loading, after which it started complaining, which indicates that the XML is being found.
Edit: In case something inherently wrong with the XML is suspected, I've uploaded it for inspection here. It was created in PhysicsEditor by adding the sprite, using the shape tracer to get the outline, and exporting with the AndEngine exporter. The XML is stored in assets/maps/ with the associated sprite stored in assets/gfx/maps/.
PS: Until I get this test working I'm using the trial version of PhysicsEditor. I do not believe it affects the outcome.