UnknownHostException behind Proxy - java

I have a Java program that connects to a website to retrieve some XML from it. This works fine on my computer, as well as on others outside our company. One of our customers is now not able to connect to the website. I figured out that they are behind a proxy. I have now found which settings I need to use, and in my test program it works (partially).
In the code below, the downloadFile() call works as expected, and the file can be downloaded without problems. The contactHost() call, however, fails on the customer's machines with an UnknownHostException:
java.net.UnknownHostException: No such host is known (api.myserver.de)
at java.base/java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
at java.base/java.net.InetAddress$PlatformNameService.lookupAllHostAddr(InetAddress.java:925)
at java.base/java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1505)
at java.base/java.net.InetAddress$NameServiceAddresses.get(InetAddress.java:844)
at java.base/java.net.InetAddress.getAllByName0(InetAddress.java:1495)
at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1354)
at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1288)
at org.apache.http.impl.conn.SystemDefaultDnsResolver.resolve(SystemDefaultDnsResolver.java:45)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:111)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
Background: Windows 10 machines; our program ships with an internal OpenJDK, version "10.0.2" 2018-07-17. The program is started with the following defines: -Djdk.http.auth.tunneling.disabledSchemes="" -Djava.net.preferIPv4Stack=true, in order to use IPv4 only and to enable Basic authentication for the proxy. With these settings the file can be downloaded, but the UnknownHostException is still there.
We have also tried to open the URL in a browser, and this works as expected, i.e. the website opens in the browser.
Here is my code for testing:
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.Authenticator;
import java.net.URL;
import java.net.URLConnection;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
public class LFTProxyTest {
private static String uname = null;
private static String pass = null;
public static void main(String[] args) {
System.setProperty("java.net.useSystemProxies", "true");
// uname = "test";  // whatever the user provides
// pass = "secret";  // whatever the user provides
Authenticator.setDefault(new ProxyAuth(uname, pass));
contactHost();
downloadFile();
}
private static boolean downloadFile() {
System.out.println("CHECK connection");
int cp = contactHost();
if (cp == 200)
return true;
if (cp == 407)
return false;
else {
try {
System.out.println("Try loading file: ");
URL url = new URL("https://www.google.de");
URLConnection urlConnection = url.openConnection();
InputStream in = new BufferedInputStream(urlConnection.getInputStream());
DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
DocumentBuilder dBuilder = dbFactory.newDocumentBuilder();
dBuilder.parse(in);
System.out.println(" FILE DOWNLOAD successfull!");
} catch (Exception e) {
System.out.println(" FILE DOWNLOAD failed:");
System.out.println("***EXCEPTION: " + e.getMessage());
return false;
}
}
System.out.println("CHECK done");
return true;
}
private static int contactHost() {
HttpClient client = HttpClientBuilder.create().build();// new DefaultHttpClient();
String catalogURI = "https://api.myserver.de/query";
HttpGet request = new HttpGet(catalogURI);
try {
int ret = 0;
HttpResponse response = client.execute(request);
ret = response.getStatusLine().getStatusCode();
System.out.println("PROXY test: " + ret);
((CloseableHttpClient) client).close();
return ret;
} catch (IOException e) {
e.printStackTrace();
return -1;
}
}
}
I don't know what to do now; I'm not even sure where the error could be. Any ideas are highly appreciated!

OK, so after some further digging, I found out that org.apache.http.client.HttpClient does not respect java.net.useSystemProxies at all, whether it is set via System.setProperty() or via -D. It also ignores http.proxyHost etc. The solution is to use a ProxySelector like this:
ProxySelector.setDefault(new ProxySelector() {
    @Override
    public List<Proxy> select(URI uri) {
        ArrayList<Proxy> list = new ArrayList<Proxy>();
        list.add(new Proxy(Proxy.Type.HTTP, new InetSocketAddress("proxy1.de", 8000)));
        list.add(new Proxy(Proxy.Type.HTTP, new InetSocketAddress("proxy2.de", 8080)));
        return list;
    }

    @Override
    public void connectFailed(URI uri, SocketAddress sa, IOException ioe) {
        logger.error("Error in ProxySelector, connection failed: ", ioe);
    }
});
I'm getting another exception now, but I might open another thread for this.
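For completeness: with HttpClient 4.3+ you can also point the client at a ProxySelector (or at the http.proxy* system properties via HttpClientBuilder.useSystemProperties()) when you build it, instead of relying on JVM-wide defaults. The following is only a sketch of that approach; the proxy credentials handling is an assumption modelled on the ProxyAuth authenticator above, and user/pass are placeholders.
import java.net.ProxySelector;

import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.impl.conn.SystemDefaultRoutePlanner;

public class ProxyAwareClientFactory {

    // user/pass are whatever the customer provides for the proxy (placeholders).
    public static CloseableHttpClient create(String user, String pass) {
        BasicCredentialsProvider creds = new BasicCredentialsProvider();
        // AuthScope.ANY answers the proxy challenge regardless of host/port.
        creds.setCredentials(AuthScope.ANY, new UsernamePasswordCredentials(user, pass));

        return HttpClientBuilder.create()
                // Route requests through whatever ProxySelector.getDefault() returns,
                // i.e. the selector installed with ProxySelector.setDefault(...) above.
                .setRoutePlanner(new SystemDefaultRoutePlanner(ProxySelector.getDefault()))
                .setDefaultCredentialsProvider(creds)
                .build();
    }
}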

UnknownHostException designates a pretty straightforward problem: the IP address of the remote host you are trying to reach cannot be resolved. So the solution is simple: check the input of the Socket (or of any other method that throws an UnknownHostException) and validate that it is the intended one. If you are not sure whether you have the correct host name, you can open a terminal and use the nslookup command (among others) to see if your DNS server can resolve the host name to an IP address successfully.
nslookup also works on Windows; on UNIX you can additionally use the host command. If the lookup doesn't work as expected, check that the host name you have is correct and then try to flush your DNS cache. If that doesn't help either, try a different DNS server; Google Public DNS, for example, is a very good alternative.
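To reproduce the lookup from inside the JVM itself, a quick check with InetAddress works too; this is a small sketch using the host name from the question above.
import java.net.InetAddress;
import java.net.UnknownHostException;

public class DnsCheck {
    public static void main(String[] args) {
        String host = args.length > 0 ? args[0] : "api.myserver.de";
        try {
            // Same resolver path that Apache HttpClient's SystemDefaultDnsResolver uses.
            for (InetAddress addr : InetAddress.getAllByName(host)) {
                System.out.println(host + " -> " + addr.getHostAddress());
            }
        } catch (UnknownHostException e) {
            System.out.println("Cannot resolve " + host + ": " + e.getMessage());
        }
    }
}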

Related

Redirect Client in Java-Run Server

I am creating a Java HTTP server that checks that a client is not banned before redirecting to the main server. I have already created everything that is needed for the server; I just don't know how to redirect to another port where the main server is running. Here is my code:
package netlyaccesscontrol;
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
public class AllowedCheck {
public static void main(String[] args) {
String line = null;
try {
FileReader reader = new FileReader("Banned.txt");
BufferedReader buffer = new BufferedReader(reader);
ServerSocket s = new ServerSocket(80);
Socket c = s.accept();
String clientIP = c.getInetAddress().toString();
while ((line = buffer.readLine()) != null) {
if (clientIP == line) {
s.close();
} else {
// redirect to main server here
}
}
} catch (FileNotFoundException ex) {
System.out.println("The banned IP address file does not exist.");
} catch (IOException ex) {
ex.printStackTrace();
}
}
}
The redirection you are thinking of is something supported by HTTP and by browsers: there is a specific HTTP response code that tells the caller to redirect, and a header that tells it where to go.
Raw sockets are a low-level networking mechanism and will not support redirection the way you expect. The most you might be able to do is have this program act as a proxy and, upon success, push all incoming data and outgoing responses to and from the ultimate server. But what you have here is by no means going to cut it.
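For illustration only, a bare-bones HTTP redirect written straight to the accepted socket would look roughly like the sketch below. The target URL is a placeholder, and a real setup would still have to read and parse the incoming request first.
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class RedirectResponder {

    // Answers an accepted connection with an HTTP 302 pointing at the main server.
    // "http://localhost:8080/" is a placeholder; a real proxy would also read and
    // parse the incoming request before responding.
    static void redirectToMainServer(Socket client) throws IOException {
        String response = "HTTP/1.1 302 Found\r\n"
                + "Location: http://localhost:8080/\r\n"
                + "Content-Length: 0\r\n"
                + "Connection: close\r\n"
                + "\r\n";
        OutputStream out = client.getOutputStream();
        out.write(response.getBytes(StandardCharsets.US_ASCII));
        out.flush();
        client.close();
    }
}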

Java Current Directory Returns `null`

I'm trying to use printWorkingDirectory() from Apache Commons Net FTP, but it only returns null. I can't navigate directories, list files, etc.
Login reports success, but however I try, I cannot change the current directory.
I use this following code:
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;
public class FTPDownloadFileDemo {
public static void main(String[] args) {
String server = "FTP server Address";
int port = portNo;
String user = "User Name";
String pass = "Password";
FTPClient ftpClient = new FTPClient();
String dir = "stocks/";
try {
ftpClient.connect(server, port);
ftpClient.login(user, pass);
System.out.println( ftpClient.printWorkingDirectory());//Always null
//change current directory
ftpClient.changeWorkingDirectory(dir);
boolean success = ftpClient.changeWorkingDirectory(dir);
// showServerReply(ftpClient);
if (success)// never success
System.out.println("Successfully changed working directory.");
System.out.println(ftpClient.printWorkingDirectory());// Always null
} catch (IOException ex) {
System.out.println("Error: " + ex.getMessage());
ex.printStackTrace();
} finally {
try {
if (ftpClient.isConnected()) {
ftpClient.logout();
ftpClient.disconnect();
}
} catch (IOException ex) {
ex.printStackTrace();
}
}
}
}
This is a rather old question that deserves an answer. The issue is likely a result of using FTPClient when a secure connection is required. You may have to switch to FTPSClient if that is, indeed, the case. Further, if the secure client doesn't solve it, output the responses from the server with the following code snippet to troubleshoot the issue:
ftpClient.addProtocolCommandListener(
new PrintCommandListener(
new PrintWriter(new OutputStreamWriter(System.out, "UTF-8")), true));
Also, a server can reject your login attempt if your IP address is not whitelisted, so being able to see the logs is imperative. The reason you see null when printing the current working directory is that you are not logged in. The login method will not throw an exception but rather returns a boolean value indicating whether the operation succeeded. You are checking for success when changing a directory, but not when logging in.
boolean success = ftpClient.login(user, pass);
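Putting that together, a hedged sketch of the connect/login sequence could look like this. The reply-code check and the passive-mode call are additions of mine, not something the original code had; passive mode often helps behind NAT/firewalls.
import java.io.IOException;

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPReply;

public class FtpLoginCheck {

    // Connects, verifies the greeting and the login reply, and only then
    // asks for the working directory.
    static FTPClient connectAndLogin(String server, int port, String user, String pass) throws IOException {
        FTPClient ftpClient = new FTPClient();
        ftpClient.connect(server, port);
        if (!FTPReply.isPositiveCompletion(ftpClient.getReplyCode())) {
            ftpClient.disconnect();
            throw new IOException("FTP server refused connection");
        }
        if (!ftpClient.login(user, pass)) {
            throw new IOException("FTP login failed: " + ftpClient.getReplyString());
        }
        ftpClient.enterLocalPassiveMode(); // extra guess: often needed behind NAT/firewalls
        System.out.println("PWD after login: " + ftpClient.printWorkingDirectory());
        return ftpClient;
    }
}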
I faced the same issue, but I got around it with a simple step.
Just add this:
boolean success = ftpClient.changeWorkingDirectory(dir);
ftpClient.printWorkingDirectory(); // add this line after changing the working directory
System.out.println(ftpClient.printWorkingDirectory()); // won't be null any more
Here I have the code and the console output
FTPClient.changeWorkingDirectory - Unknown parser type: "/Path" is current directory
I know this reply comes rather late ;-P, but I only saw this post recently. Hope this helps future searchers ;-)

Using NTLM authentication in Java applications

I want to use Windows NTLM authentication in my Java application to authenticate intranet users transparently. The users should not notice any authentication if using their browsers (single sign-on).
I've found a few libs with NTLM support, but don't know which one to use:
http://spnego.sourceforge.net/
http://sourceforge.net/projects/ntlmv2auth/
http://jcifs.samba.org/
http://www.ioplex.com/jespa.html
http://www.luigidragone.com/software/ntlm-authentication-in-java/
Any suggestions where to start?
Out of the above list, only ntlmv2-auth and Jespa support NTLMv2. Jespa is workable but commercial. ntlmv2-auth I haven't tried but it's based on the code from Liferay, which I've seen working before.
'ntlm-authentication-in-java' is only NTLMv1, which is old, insecure, and works in a dwindling number of environments as people upgrade to newer Windows versions. JCIFS used to have an NTLMv1 HTTP auth filter, but it was removed in later versions, as the way it was implemented amounts to a man-in-the-middle attack on the insecure protocol. (The same appears to be true of 'ntlm-authentication-in-java'.)
The 'spnego' project is Kerberos not NTLM. If you want to replicate full IWA as IIS does it, you'd need to support both NTLMv2 and Kerberos ('NTLM' auth, 'Negotiate' auth, NTLMSSP-in-SPNego auth and NTLM-masquerading-as-Negotiate auth).
Luigi Dragone's script is really old and seems to always fail.
HttpURLConnection can work with NTLM if you add the jcifs library; this example works with the latest jcifs-1.3.18:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.UnknownHostException;
import java.util.HashMap;
import java.util.Map;
import org.apache.http.impl.auth.NTLMEngineException;
public class TestNTLMConnection {
public static void main(String[] args) throws UnknownHostException, IOException, NTLMEngineException {
// Method 1 : authentication in URL
jcifs.Config.registerSmbURLHandler();
URL urlRequest = new URL("http://domain%5Cuser:pass@127.0.0.1/");
// or Method 2 : authentication via System.setProperty()
// System.setProperty("http.auth.ntlm.domain", "domain");
// System.setProperty("jcifs.smb.client.domain", "domain");
// System.setProperty("jcifs.smb.client.username", "user");
// System.setProperty("jcifs.smb.client.password", "pass");
// Not verified // System.setProperty("jcifs.netbios.hostname", "host");
// System.setProperty("java.protocol.handler.pkgs", "jcifs");
// URL urlRequest = new URL("http://127.0.0.1:8180/simulate_get.php");
HttpURLConnection conn = (HttpURLConnection) urlRequest.openConnection();
StringBuilder response = new StringBuilder();
try {
InputStream stream = conn.getInputStream();
BufferedReader in = new BufferedReader(new InputStreamReader(stream));
String str = "";
while ((str = in.readLine()) != null) {
response.append(str);
}
in.close();
System.out.println(response);
} catch(IOException err) {
System.out.println(err);
} finally {
Map<String, String> msgResponse = new HashMap<String, String>();
for (int i = 0;; i++) {
String headerName = conn.getHeaderFieldKey(i);
String headerValue = conn.getHeaderField(i);
if (headerName == null && headerValue == null) {
break;
}
msgResponse.put(headerName == null ? "Method" : headerName, headerValue);
}
System.out.println(msgResponse);
}
}
}
And if you are curious about the content of each handshake, you can find another example using jcifs and Socket on this thread.
I had to implement this recently at work, so here is an updated solution using Spring's RestTemplate:
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.NTCredentials;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.client.HttpClients;
import org.springframework.http.HttpEntity;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;
import java.io.IOException;
public class Runner {
public static void main(String[] args) {
var credentialsProvider = new BasicCredentialsProvider();
credentialsProvider.setCredentials(AuthScope.ANY, new NTCredentials("username", "password", "", "someDomain"));
try (var client = HttpClients.custom()
.setDefaultCredentialsProvider(credentialsProvider)
.build();) {
var requestFactory = new HttpComponentsClientHttpRequestFactory();
requestFactory.setHttpClient(client);
RestTemplate restTemplate = new RestTemplate(requestFactory);
ResponseEntity<String> stringResponseEntity = restTemplate.postForEntity("url", new HttpEntity<>("yourDtoObject"), String.class);
} catch (IOException e) {
e.printStackTrace();
}
}
}
The dependencies needed are spring-web and org.apache.httpcomponents (httpclient).
PS: it is important to enter the username without the domain, otherwise it doesn't work. That is, if your login is companyName\username, people often enter that whole thing as the username; what you should do is enter the two parts separately, with domain="companyName" and username="username".
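In other words, something along these lines (all names are placeholders):
import org.apache.http.auth.NTCredentials;

class NtlmCredentialsExample {
    // Wrong: baking the domain into the user name usually breaks NTLM negotiation:
    //   new NTCredentials("companyName\\username", "password", "", "");
    // Right: pass the domain separately; the third argument is the workstation
    // name and can normally stay empty.
    static NTCredentials build() {
        return new NTCredentials("username", "password", "", "companyName");
    }
}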
Ref: https://jcifs.samba.org/src/docs/faq.html#ntlmv2
Q: Does jCIFS support NTLMv2?
A: Yes. As of 1.3.0, JCIFS fully supports NTLMv2 and uses it by default.
Note: The NTLM HTTP SSO Filter that used to be included with JCIFS cannot support NTLMv2.
From the list you gave, I would go with jCIFS.
The library is mature, and its documentation is good.
To top it off, they have had fairly regular releases, the last one being Nov 2011.
Personal experience: it was fairly easy to get started with compared to the others I have tried (spnego and ntlmv2auth).

Why does this HTTP servlet behave inconsistently?

An intranet site has a search form which uses AJAX to call a servlet on a different domain for search suggestions.
This works in Internet Explorer with the intranet domain being a "trusted site" and with cross-domain requests enabled for trusted sites, but doesn't work in Firefox.
I have tried to work around the problem by creating a servlet on the intranet server, so there's a JS call to my servlet on the same domain, then my servlet calls the suggestions servlet on the other domain. The cross-domain call is server-side, so it should work regardless of browser settings.
The AJAX call and my servlet's call to the other servlet both use an HTTP POST request with the arguments in the URL and empty request content.
The reason I'm sticking with POST requests is that the JS code is all in files on the search server, which I can't modify, and that code uses POST requests.
I've tried calling the customer's existing suggestions servlet with a GET request, and it produces a 404 error.
The problem is that the result is inconsistent.
I've used System.out.println calls to show the full URL and size of the result on the server log.
The output first seemed to change depending on the calling browser and/or website, but now seems to change even between sessions of the same browser.
E.g. entering "g" in the search box, I got this output from the first few tries on the Development environment using Firefox:
Search suggestion URL: http://searchdev.companyname.com.au/suggest?q=g&max=10&site=All&client=ie&access=p&format=rich
Search suggestion result length: 64
Initial tries with Firefox on the Test environment (different intranet server but same search server) produced a result length of 0 for the same search URL.
Initial tries with Internet Explorer produced a result length of 0 in both environments.
Then I tried searching for different letters, and found that "t" produced a result in IE when "g" hadn't.
After closing the browsers and leaving it for a while, I tried again and got different results.
E.g. Using Firefox and trying "g" in the Development environment now produces no result when it was previously producing one.
The inconsistency makes me think something is wrong with my servlet code, which is shown below. What could be causing the problem?
I think the search suggestions are being provided by a Google Search Appliance, and the JS files on the search server all seem to have come from Google.
The actual AJAX call is this line in one file:
XH_XmlHttpPOST(xmlhttp, url, '', handler);
The XH_XmlHttpPOST function is as follows in another file:
function XH_XmlHttpPOST(xmlHttp, url, data, handler) {
xmlHttp.open("POST", url, true);
xmlHttp.onreadystatechange = handler;
xmlHttp.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xmlHttp.setRequestHeader("Content-Length",
/** @type {string} */ (data.length));
XH_XmlHttpSend(xmlHttp, data);
}
Here is my servlet code:
package com.companyname.theme;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.Properties;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
public class suggest extends HttpServlet {
Properties props=null;
@Override
protected void doPost(HttpServletRequest req, HttpServletResponse resp)
throws ServletException, IOException {
String result = "";
String args = req.getQueryString();
String baseURL = props.getProperty("searchFormBaseURL");
String urlStr = baseURL + "/suggest?" + args;
System.out.println("Search suggestion URL: " + urlStr);
try {
int avail, rCount;
int totalCount = 0;
byte[] ba = null;
byte[] bCopy;
URL url = new URL(urlStr);
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("POST");
conn.setDoOutput(true);
OutputStream os = conn.getOutputStream();
os.write("".getBytes());
os.close();
InputStream is = conn.getInputStream();
while ((avail = is.available()) > 0) {
if (ba == null) ba = new byte[avail];
else if (totalCount + avail > ba.length) {
// Resize ba if there's more data available.
bCopy = new byte[totalCount + avail];
System.arraycopy(ba, 0, bCopy, 0, totalCount);
ba = bCopy;
bCopy = null;
}
rCount = is.read(ba, totalCount, avail);
if (rCount < 0) break;
totalCount += rCount;
}
is.close();
conn.disconnect();
result = (ba == null ? "" : new String(ba));
System.out.println("Search suggestion result length: " + Integer.toString(result.length()));
} catch(MalformedURLException e) {
e.printStackTrace();
} catch(IOException e) {
e.printStackTrace();
}
PrintWriter pw = resp.getWriter();
pw.print(result);
}
@Override
public void init() throws ServletException {
super.init();
InputStream stream = this.getClass().getResourceAsStream("/WEB-INF/lib/endeavour.properties");
props = new Properties();
try {
props.load(stream);
stream.close();
} catch (Exception e) {
// TODO: handle exception
}
}
}
Solution: don't rely on InputStream.available().
The JavaDoc says that the base InputStream implementation of available() always returns 0.
HttpURLConnection.getInputStream() actually returns an HttpInputStream, in which available() seems to work but apparently sometimes returns 0 even when there is more data.
I changed my read loop to not use available() at all, and now it consistently returns the expected results.
The working servlet is below.
package com.integral.ie.theme;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.Properties;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
public class suggest extends HttpServlet implements
javax.servlet.Servlet {
Properties props=null;
@Override
protected void doPost(HttpServletRequest req, HttpServletResponse resp)
throws ServletException, IOException {
//super.doPost(req, resp);
final int maxRead=200;
String result="";
String args=req.getQueryString();
String baseURL=props.getProperty("searchFormBaseURL");
String urlStr=baseURL+"/suggest?"+args;
//System.out.println("Search suggestion URL: "+urlStr);
try {
int rCount=0;
int totalCount=0;
int baLen=maxRead;
byte[] ba=null;
byte[] bCopy;
URL url=new URL(urlStr);
HttpURLConnection conn=(HttpURLConnection)url.openConnection();
conn.setRequestMethod("POST");
// Setting these properties may be unnecessary - just did it
// because the GSA javascript does it.
conn.setRequestProperty("Content-Type","application/x-www-form-urlencoded");
conn.setRequestProperty("Content-Length","0");
InputStream is=conn.getInputStream();
ba=new byte[baLen];
while (rCount>=0) {
try {
rCount=is.read(ba,totalCount,baLen-totalCount);
if (rCount>0) {
totalCount+=rCount;
if (totalCount>=baLen) {
baLen+=maxRead;
bCopy=new byte[baLen];
System.arraycopy(ba,0,bCopy,0,totalCount);
ba=bCopy;
bCopy=null;
}
}
} catch(IOException e) {
// IOException while reading - allow the method to return
// anything we've read so far.
}
}
is.close();
conn.disconnect();
result=(totalCount==0?"":new String(ba,0,totalCount));
//System.out.println("Search suggestion result length: "
//+Integer.toString(result.length()));
} catch(MalformedURLException e) {
e.printStackTrace();
} catch(IOException e) {
e.printStackTrace();
}
PrintWriter pw=resp.getWriter();
pw.print(result);
}
@Override
public void init() throws ServletException {
super.init();
InputStream stream=this.getClass().getResourceAsStream("/WEB-INF/lib/endeavour.properties");
props=new Properties();
try {
props.load(stream);
stream.close();
} catch (Exception e) {
// TODO: handle exception
}
}
}
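As an aside, the manual buffer resizing can be avoided altogether by reading into a fixed-size chunk and collecting the bytes in a ByteArrayOutputStream. This is a hedged simplification of the loop above, not what the servlet actually shipped with:
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

final class StreamUtil {

    // Reads the stream to the end without ever calling available().
    static String readFully(InputStream is) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int read;
        while ((read = is.read(chunk)) != -1) {
            buffer.write(chunk, 0, read);
        }
        return buffer.toString("UTF-8"); // assumes the suggestions come back as UTF-8
    }
}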
Start with a unit test. Servlets are pretty straightforward to unit test and HttpUnit has worked for us.
Debugging Servlet code in a browser and with println calls will cost more time in the long run and it's difficult for someone on SO to digest all of that information to help you.
Also, consider using a JavaScript framework such as JQuery for your AJAX calls. In my opinion there's little reason to touch an xmlHttp object directly now that frameworks will hide that for you.

Getting the 'external' IP address in Java

I'm not too sure how to go about getting the external IP address of the machine as a computer outside of a network would see it.
My following IPAddress class only gets the local IP address of the machine.
public class IPAddress {
private InetAddress thisIp;
private String thisIpAddress;
private void setIpAdd() {
try {
InetAddress thisIp = InetAddress.getLocalHost();
thisIpAddress = thisIp.getHostAddress().toString();
} catch (Exception e) {
}
}
protected String getIpAddress() {
setIpAdd();
return thisIpAddress;
}
}
I am not sure if you can grab that IP from code that runs on the local machine.
You can however build code that runs on a website, say in JSP, and then use something that returns the IP of where the request came from:
request.getRemoteAddr()
Or simply use already-existing services that do this, then parse the answer from the service to find out the IP.
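For illustration, a minimal servlet on the server side might look something like this (class name and mapping are placeholders; behind a load balancer or reverse proxy you would have to look at X-Forwarded-For instead):
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class WhatIsMyIpServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // The address the request came from, as seen by this server.
        resp.setContentType("text/plain");
        resp.getWriter().print(req.getRemoteAddr());
    }
}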
Use a web service like AWS's checkip or others:
import java.net.*;
import java.io.*;
URL whatismyip = new URL("http://checkip.amazonaws.com");
BufferedReader in = new BufferedReader(new InputStreamReader(
whatismyip.openStream()));
String ip = in.readLine(); //you get the IP as a String
System.out.println(ip);
One of the comments by @stivlo deserves to be an answer:
You can use the Amazon service http://checkip.amazonaws.com
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
public class IpChecker {
public static String getIp() throws Exception {
URL whatismyip = new URL("http://checkip.amazonaws.com");
BufferedReader in = null;
try {
in = new BufferedReader(new InputStreamReader(
whatismyip.openStream()));
String ip = in.readLine();
return ip;
} finally {
if (in != null) {
try {
in.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}
The truth is: 'you can't' in the sense that you posed the question. NAT happens outside of the protocol. There is no way for your machine's kernel to know how your NAT box is mapping from external to internal IP addresses. Other answers here offer tricks involving methods of talking to outside web sites.
All of these are still up and working smoothly (as of 10 Feb 2022):
http://checkip.amazonaws.com/
https://ipv4.icanhazip.com/
http://myexternalip.com/raw
http://ipecho.net/plain
http://www.trackip.net/ip
http://bot.whatismyipaddress.com (10 Feb 2022)
http://curlmyip.com/ (17 Dec 2016)
A piece of advice: do not depend directly on only one of them; use one, but have a contingency plan that considers the others! The more you use, the better!
Good luck!
As @Donal Fellows wrote, you have to query the network interface instead of the machine. This code from the javadocs worked for me:
The following example program lists all the network interfaces and their addresses on a machine:
import java.io.*;
import java.net.*;
import java.util.*;
import static java.lang.System.out;
public class ListNets {
public static void main(String args[]) throws SocketException {
Enumeration<NetworkInterface> nets = NetworkInterface.getNetworkInterfaces();
for (NetworkInterface netint : Collections.list(nets))
displayInterfaceInformation(netint);
}
static void displayInterfaceInformation(NetworkInterface netint) throws SocketException {
out.printf("Display name: %s\n", netint.getDisplayName());
out.printf("Name: %s\n", netint.getName());
Enumeration<InetAddress> inetAddresses = netint.getInetAddresses();
for (InetAddress inetAddress : Collections.list(inetAddresses)) {
out.printf("InetAddress: %s\n", inetAddress);
}
out.printf("\n");
}
}
The following is sample output from the example program:
Display name: TCP Loopback interface
Name: lo
InetAddress: /127.0.0.1
Display name: Wireless Network Connection
Name: eth0
InetAddress: /192.0.2.0
From docs.oracle.com
Make an HttpURLConnection to some site like www.whatismyip.com and parse that :-)
How about this? It's simple and worked the best for me :)
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;
public class IP {
public static void main(String args[]) {
new IP();
}
public IP() {
URL ipAdress;
try {
ipAdress = new URL("http://myexternalip.com/raw");
BufferedReader in = new BufferedReader(new InputStreamReader(ipAdress.openStream()));
String ip = in.readLine();
System.out.println(ip);
} catch (MalformedURLException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
http://jstun.javawi.de/ will do it - provided your gateway device does STUN (most do).
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;
import java.util.regex.Pattern;
public class ExternalIPUtil {
private static final Pattern IPV4_PATTERN = Pattern.compile("^(([01]?\\d\\d?|2[0-4]\\d|25[0-5])\\.){3}([01]?\\d\\d?|2[0-4]\\d|25[0-5])$");
private static final String[] IPV4_SERVICES = {
"http://checkip.amazonaws.com/",
"https://ipv4.icanhazip.com/",
"http://bot.whatismyipaddress.com/"
// and so on ...
};
public static String get() throws ExecutionException, InterruptedException {
List<Callable<String>> callables = new ArrayList<>();
for (String ipService : IPV4_SERVICES) {
callables.add(() -> get(ipService));
}
ExecutorService executorService = Executors.newCachedThreadPool();
try {
return executorService.invokeAny(callables);
} finally {
executorService.shutdown();
}
}
private static String get(String url) throws IOException {
try (BufferedReader in = new BufferedReader(new InputStreamReader(new URL(url).openStream()))) {
String ip = in.readLine();
if (IPV4_PATTERN.matcher(ip).matches()) {
return ip;
} else {
throw new IOException("invalid IPv4 address: " + ip);
}
}
}
public static void main(String[] args) throws ExecutionException, InterruptedException {
System.out.println("IP: " + get());
}
}
Get the IP from multiple services concurrently, such as:
http://checkip.amazonaws.com/
https://ipv4.icanhazip.com/
http://bot.whatismyipaddress.com/
and so on ...
and ExecutorService.invokeAny(tasks) returns the result of the first successfully completed task. Other tasks that have not completed will be cancelled.
It's not that easy, since a machine inside a LAN usually doesn't care about the external IP of its router to the internet... it simply doesn't need it!
I would suggest you work around this by opening a site like http://www.whatismyip.com/ and getting the IP number by parsing the HTML result... it shouldn't be that hard!
If you are using a Java-based web app and you want to grab the external IP of the client (the one making the request via a browser), try deploying the app on a public domain and use request.getRemoteAddr() to read the external IP address.
System.out.println(pageCrawling.getHtmlFromURL("http://ipecho.net/plain"));
An alternative solution is to execute an external command; obviously, this limits the portability of the application.
For example, for an application that runs on Windows, a PowerShell command can be executed through jPowershell, as shown in the following code:
public String getMyPublicIp() {
// PowerShell command
String command = "(Invoke-WebRequest ifconfig.me/ip).Content.Trim()";
String powerShellOut = PowerShell.executeSingleCommand(command).getCommandOutput();
// Connection failed
if (powerShellOut.contains("InvalidOperation")) {
powerShellOut = null;
}
return powerShellOut;
}
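If pulling in jPowershell is not an option, roughly the same thing can be done with the JDK's own ProcessBuilder. This is a sketch under the assumption that powershell.exe is on the PATH of the Windows machine and that the ifconfig.me service is reachable:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ExternalIpViaPowerShell {

    static String getMyPublicIp() throws IOException, InterruptedException {
        Process process = new ProcessBuilder(
                "powershell.exe", "-Command", "(Invoke-WebRequest ifconfig.me/ip).Content.Trim()")
                .redirectErrorStream(true) // merge stderr into stdout for simpler reading
                .start();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line = reader.readLine(); // first line should be the address
            process.waitFor();
            return line;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(getMyPublicIp());
    }
}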
