Connecting to API - Java

I plan to build a simple website that takes data from another website's API and displays it in charts on my site. Here is what I am trying to do:
1. Take data from the API listed on localbitcoins.com
   https://localbitcoins.com/api-docs/
2. Code a program that parses this data as I see fit.
3. Create a graphical layout that displays the data.
4. Post the graphs on a website that I have created, ideally updating in real time.
I don't know where to begin.
I am not asking for someone to hold my hand through it all, but rather for some pointers on where to start, what resources I can look at, and so on.
My instinct tells me that I need to tackle the API and coding part first. Can someone point me to a resource that could take me through this? Should I stick with Java, or should I use another language for this?
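(For orientation, step 1 on its own is quite small in Java. A minimal sketch with the built-in HTTP client (Java 11+) might look like the following; the endpoint path here is only an illustration, so check the API docs linked above for the endpoints that actually exist.)

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TickerFetcher {
    public static void main(String[] args) throws Exception {
        // The endpoint path is an assumption for illustration only;
        // see https://localbitcoins.com/api-docs/ for the real public endpoints.
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("https://localbitcoins.com/bitcoinaverage/ticker-all-currencies/"))
                .header("Accept", "application/json")
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // From here the JSON string could be parsed (e.g. with Jackson or Gson)
        // and handed to whatever charting layer the site uses.
        System.out.println(response.body());
    }
}
```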

Related

Is there a tool/API to extract Minecraft recipes?

A friend and I decided to code a Discord bot in Java, using JDA. The idea is for the bot to give you a requested Minecraft recipe (a picture of it). However, we don't want to have to download every single recipe (there are way too many of them). So I was wondering if there is something we can use that would give us the recipes with pictures and everything, like an API or a website that we can access from the Java code and that would return something we can use. (No code attached since we haven't really done anything and it would serve no purpose.)
I am not sure an API exists that can do that for you. If the problem is spending the time to download every single recipe, I might recommend creating a web scraper that could get that data for you. Getting the images from a site like https://www.minecraftcrafting.info/ would be fairly straightforward using Python and BeautifulSoup (https://pypi.org/project/beautifulsoup4/). Hope that helps and good luck!
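Since the bot itself is written in Java, the same scraping idea can also be sketched with jsoup (mentioned in a later answer here) instead of Python/BeautifulSoup. This is only a rough outline; the site's real structure and the right CSS selector would have to be checked in a browser first.

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class RecipeScraper {
    public static void main(String[] args) throws Exception {
        Files.createDirectories(Paths.get("recipes"));

        // Fetch and parse the page; the selector below ("img") is a placeholder,
        // inspect the actual page to target only the recipe images.
        Document doc = Jsoup.connect("https://www.minecraftcrafting.info/").get();

        for (Element img : doc.select("img")) {
            String src = img.absUrl("src");
            if (src.isEmpty()) continue;
            String name = src.substring(src.lastIndexOf('/') + 1);
            if (name.isEmpty()) continue;

            // Download each image into the local "recipes" folder.
            try (InputStream in = new URL(src).openStream()) {
                Files.copy(in, Paths.get("recipes", name), StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }
}
```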

How can I make my Java program retrieve data sets online?

Currently I am working on a program that will assist me in making decisions when trying to bet on sports. My goal is that each day the program retrieves things like weather, past games, player/team stats, etc., and then aggregates it all so that I can see which teams make the most sense to bet on.
I'm not exactly sure if it's even possible to do in IntelliJ, the editor I'm using, because I do not think it's connected to the internet on its own. I think one approach would be to use a separate program (not IntelliJ) to automatically go to each website and copy the appropriate information into an Excel document; then I could copy the file into my project each day before I run it. Something like that is what I have in mind, but I would appreciate some help if anyone knows which strategy I could use to move past this obstacle.
I've recently learned how to create a GUI to navigate my program a little more easily than through the console; therefore, my work ethic is not a barrier in this instance. I've taken one programming class in college and would consider myself an apprentice (one step above a novice).
You can use jsoup to scrape data from a website, then use Apache POI to write it to an Excel file.
Here's the jsoup website: https://jsoup.org/
Here's a good tutorial about Apache POI: https://www.baeldung.com/java-microsoft-excel
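Putting the two together, a minimal sketch might look like the following. The URL and the CSS selector are placeholders (assumptions for illustration); you would inspect the real stats page to find the right ones.

```java
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.jsoup.select.Elements;

import java.io.FileOutputStream;

public class StatsScraper {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and selector: point these at the real stats page and table.
        Document doc = Jsoup.connect("https://example.com/team-stats").get();
        Elements rows = doc.select("table.stats tr");

        try (Workbook wb = new XSSFWorkbook()) {
            Sheet sheet = wb.createSheet("Stats");
            int r = 0;
            for (Element tr : rows) {
                Row excelRow = sheet.createRow(r++);
                int c = 0;
                // Copy every header/data cell of the HTML table into the spreadsheet row.
                for (Element cell : tr.select("th, td")) {
                    excelRow.createCell(c++).setCellValue(cell.text());
                }
            }
            try (FileOutputStream out = new FileOutputStream("stats.xlsx")) {
                wb.write(out);
            }
        }
    }
}
```

Running this once a day (by hand or via a scheduler) would give you a fresh stats.xlsx that the rest of your program can read.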

Making a browser without using JTextPane or any other class that reads HTML

Good evening. I'm working on a project with a team; we have to make a browser without using JEditorPane or any other class that reads HTML.
How can we do that? Do we need to make a new class that does what JEditorPane does? Can I find somewhere JEditorPane's code? Thanks!
Well, here is an answer:
If you need to display web content without using any pre-existing engine (such as JEditorPane or an embedded Chromium binding), you need to read the HTML as an XML-like document and construct your native view from it (without CSS and JS this is a fairly easy task), building the screen from a one-to-one mapping of each HTML tag to a Java JComponent.
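A toy sketch of that tag-to-JComponent idea is shown below. It uses jsoup purely as the parser (any XML/HTML parser would do) and handles only a couple of tags, so it is nowhere near a real rendering engine.

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

import javax.swing.*;
import java.awt.Font;

public class TinyRenderer {

    // Map a handful of tags to Swing components; unknown tags become a panel
    // that simply stacks its rendered children. CSS and JS are out of scope.
    static JComponent render(Element el) {
        switch (el.tagName()) {
            case "h1": {
                JLabel label = new JLabel(el.ownText());
                label.setFont(label.getFont().deriveFont(Font.BOLD, 24f));
                return label;
            }
            case "p":
                return new JLabel(el.text());
            case "a":
                return new JButton(el.text()); // a link could become a button that triggers navigation
            default: {
                JPanel panel = new JPanel();
                panel.setLayout(new BoxLayout(panel, BoxLayout.Y_AXIS));
                for (Element child : el.children()) {
                    panel.add(render(child));
                }
                return panel;
            }
        }
    }

    public static void main(String[] args) {
        Document doc = Jsoup.parse("<html><body><h1>Hello</h1><p>A paragraph.</p></body></html>");
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Tiny renderer");
            frame.add(new JScrollPane(render(doc.body())));
            frame.setSize(400, 300);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```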
Modern Web Browsers are pretty complicated, so there are a lot of different pieces that come together to display a web page. In order to build a browser, you need to first understand what a browser is. For that, I recommend reading this tutorial.
Once you have an understanding of how a browser actually works, you need to determine which pieces you can reuse and which pieces you have to write from scratch. Do you have to write the entire rendering engine? Good luck! Can you use an existing engine like Gecko or WebKit? Or maybe you can get a little closer to done and use the Java port of WebKit?
Once you have a better understanding of the question come back and ask more direct questions when you get stuck at a specific piece. As it is, your first step is to gain an understanding of the problem you are trying to solve.
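If reusing an engine turns out to be allowed, the "Java port of WebKit" mentioned above is available today as JavaFX's WebView, and a bare-bones browser window is only a few lines. This sketch assumes JavaFX is on your classpath/module path:

```java
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.web.WebView;
import javafx.stage.Stage;

public class MiniBrowser extends Application {
    @Override
    public void start(Stage stage) {
        // WebView wraps a WebKit-based engine; the URL is just an example.
        WebView view = new WebView();
        view.getEngine().load("https://example.com");
        stage.setScene(new Scene(view, 800, 600));
        stage.setTitle("Mini browser");
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}
```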

How can I use Java to call an existing RPG screen program?

I have existing RPG IV programs with green screens; I would like to be able to call the RPG programs with Java and bypass the green screens.
I have done some research on this, and IBM OAR (Open Access: RPG) keeps coming up, but I have not found a working example yet.
My goal is to create a web app to collect the same information and feed it to the back-end RPG.
Any help would be greatly appreciated.
EDIT
Originally: You can't.
Revised: A beginner will need to master several complex new concepts before tackling this.
END-EDIT
At least, not without changing the RPG program. Web requests are processed by server jobs, which run in batch - they are not connected to a 5250 terminal. Because they aren't connected to a terminal, when the RPG program tries to open the display file, it will fall over because there's no terminal to attach to.
In order for this to work you'd have to alter the RPG program to not try display file I/O if called by a batch process like a Java app (although Java isn't necessary in this web scenario).
One way to change the RPG program is to use input parameters; if you have them, then don't try to open the display file, but stuff the input parameters into the fields where the display file would have done. Since a display file also outputs from the program you'd need to reserve some parameters for the output information as well. This could get very ugly if a subfile is involved, as there would be potentially thousands of parameters.
OAR comes into the picture because one can write an OAR handler that continues to use the same display file I/O operations, but to direct the actual I/O elsewhere, like STDIN and STDOUT for an HTTP type application. Jon Paris and Susan Gantner have written an article called Getting a Handle on RPG's Open Access which you might find helpful. It's in the July 2010 e-edition of IBM Systems Magazine.
Better perhaps is to extract the business logic from the RPG program and implement it as stored procedures, which can be called by the web application via traditional ODBC/JDBC. One can write stored procedures in RPG, so that's not as hard as it might seem.
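For the Java side of that stored-procedure route, a minimal sketch using plain JDBC with the jt400 (JTOpen) driver might look like this. The system name, credentials, library, procedure, and parameters are all made-up placeholders for illustration; jt400.jar must be on the classpath.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class CallRpgProcedure {
    public static void main(String[] args) throws Exception {
        // jt400 registers itself via the JDBC service loader when it is on the classpath.
        String url = "jdbc:as400://MYSYSTEM"; // hypothetical host name

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             // Hypothetical stored procedure wrapping the RPG business logic.
             CallableStatement cs = conn.prepareCall("CALL MYLIB.GET_ORDER_TOTAL(?, ?)")) {

            cs.setInt(1, 12345);                       // input: order number (example)
            cs.registerOutParameter(2, Types.DECIMAL); // output: order total
            cs.execute();

            System.out.println("Total: " + cs.getBigDecimal(2));
        }
    }
}
```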
OAR is probably going to be your best bet....
However, every example I can think of that I've seen has revolved around building a handler to replace a printer file (PRTF) or physical file (PF).
Replacing a display file (DSPF) is a whole other ball game, primarily because the 5250 protocol is an "intelligent" protocol, unlike dumb character-type protocols such as those used by ANSI/VT100 terminals.
It certainly can be, and has been, done. If you have a single basic screen, you might be able to do it. But for a complex application with multiple screens and subfiles you'll probably have a tough time, especially if you don't have an in-depth understanding of the 5250 protocol.
I'd recommend you take a look at one of the vendor toolsets designed to use OAR to replace a 5250 screen with a web page. Those vendors have put years of time and effort into developing the handlers needed.
http://www.profoundlogic.com/solutions/rpg-application-modernization.html
https://asna.com/us/products/wings
You might find the following publication useful:
Modernizing IBM i Applications
Lastly, note that OAR isn't the only option. There's an older technique, "screen scraping," in which your application basically emulates a 5250 terminal. It's simpler than a full OAR handler, but the end result is simpler also. IBM has its own tool, HATS. And, for instance, Profound Logic also has a tool, GENIE. You could conceivably build your own screen scraper; the open-source TN5250J would probably be a place to start. But even that would be non-trivial.
You should use a mix of parsing JSON on the IBM i (this eliminates the subfile problems), a good JavaScript framework (I've used Ext JS), and the Apache HTTP server for i.
I've developed an HTTP services framework based on JSON parameters sent directly from the browser using Ajax, processing each request with any ILE-language program (mostly RPGLE) and returning the result as pure JSON created inside the program. With this approach, you just send/receive business data, leaving the front end to the JavaScript framework.
Hope this helps. Contact me if you need more help.

Method of extracting data from a website?

I want to grab data from a website (for example, the names, identification number, and list of resources someone is using) and post it to another website.
What I was thinking of doing was using cURL to grab the information from an existing REST API on one website. Then I want to write a program or an API to post that information to another website.
After using cURL, how/where can I store that information so that I can use it in another program? Would it be easier to write one single program that extracts the information from the first website and posts it to the other? If so, would it be possible to do so using Java, and could you give an idea of how? I'm not asking for code, just a method to do this. I'm using the Eclipse IDE for Java EE Web Developers.
I'd write it as 2-3 programs. One that extracts the data, one that formats the data (if necessary), one that posts the data.
My gut tells me the easiest way to do this is a pure bash script. But if you want to use Java for this you can.
I would save the output in a file for the post-er to read from. This has the benefit of letting you write/test the post-er without the two other programs working. That said, I recommend you write the get-er program first. That way you know what data you're really dealing with.
Now, if you happen to write both the formatter and the post-er in Java, I would write this as one program instead of "piping" files between them. The formatter would read in the file and turn it into a data structure/class, and the post-er would read from that data structure/class.
This is only superficially different from my previous paragraph. The point is that each "part" is independent of the others, which allows you to test a part without running the whole thing. That's the important thing.
As for how/where to store the information from the get-er, just redirect it to a file. Here's a tutorial on how.
Truth be told, I can't tell if you're using the Linux cURL program or a Java implementation like this one. My answer would be very different depending on which it is.
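If everything does end up in one Java program, a bare-bones get-then-post sketch with the built-in HTTP client (Java 11+) could look like this. Both URLs are placeholders, and the formatter step is left as a comment since it depends entirely on the two APIs involved.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RelayJob {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // "Get-er": pull JSON from the first site's REST API (placeholder URL).
        HttpRequest get = HttpRequest.newBuilder(URI.create("https://source.example.com/api/users"))
                .GET()
                .build();
        String body = client.send(get, HttpResponse.BodyHandlers.ofString()).body();

        // "Formatter" would go here: parse `body` (e.g. with a JSON library) and
        // reshape it into whatever the second site expects. Skipped in this sketch.

        // "Post-er": push the (re)formatted data to the second site (also a placeholder).
        HttpRequest post = HttpRequest.newBuilder(URI.create("https://target.example.com/api/import"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = client.send(post, HttpResponse.BodyHandlers.ofString());
        System.out.println("Target responded with HTTP " + response.statusCode());
    }
}
```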
