Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.
Closed 9 years ago.
Can anyone tell me where to find a Java library that does Muslim prayer time calculation based on the city? Maybe a web service? I would prefer a Java library.
http://www.javafr.com/codes/PRAYER-ALERT-SALA_40601.aspx
Hello, it's not a library but a student project, though it's quite well executed.
Or : http://www.directionsmag.com/article.php?article_id=2956
A method to calculate prayer times with Google Maps + local time (in short: check where you are, check how far you are from Makkah, and apply the right formula).
You can make use of the PrayTimes library.
http://praytimes.org/code/
It's originally written in JavaScript, but there are ports to other languages as well.
I think Ubuntu Muslim Edition uses this kind of feature in a calendar; maybe you could search around there.
For this question, the best option is https://github.com/abodehq/Pray-Times. It includes Java, Objective-C, PHP, C#, JavaScript, Python, and C++ implementations. It supports different calculation methods, and the calculation is based on GPS position. The calculation methods are Ithna Ashari; University of Islamic Sciences, Karachi; Islamic Society of North America (ISNA); Muslim World League (MWL); Umm al-Qura, Makkah; Egyptian General Authority of Survey; Institute of Geophysics, University of Tehran; and a custom setting. The juristic methods are Shafii and Hanafi, and there are adjusting methods for higher latitudes. Output formats are 24-hour, 12-hour, 12-hour with no suffix, and floating-point number.
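For orientation, all of these implementations boil down to the same solar geometry: find solar noon from your longitude and the equation of time, then offset it by the hour angle at which the sun reaches a given depression angle; the calculation methods listed above differ mainly in the Fajr/Isha angles. Here is a rough Java sketch with simplified astronomical approximations, not taken from any of the linked projects and only accurate to within a few minutes:
// Rough sketch of the solar geometry behind prayer-time calculators.
// Uses simplified approximations for declination and the equation of time;
// real libraries (PrayTimes, ITL, ...) use more precise astronomy.
public class PrayerTimesSketch {

    // Hours relative to solar noon at which the sun is 'angleDeg' degrees below the horizon.
    static double hourAngleOffset(double angleDeg, double latDeg, double declDeg) {
        double lat = Math.toRadians(latDeg), decl = Math.toRadians(declDeg);
        double a = Math.toRadians(angleDeg);
        double cosH = (-Math.sin(a) - Math.sin(decl) * Math.sin(lat))
                / (Math.cos(decl) * Math.cos(lat));
        return Math.toDegrees(Math.acos(cosH)) / 15.0;   // degrees -> hours
    }

    public static void main(String[] args) {
        int dayOfYear = 100;        // example: 10 April
        double latitude = 51.5;     // example coordinates
        double longitude = -0.12;
        double timezone = 1.0;      // UTC offset in hours

        // Approximate solar declination and equation of time for that day.
        double b = 2 * Math.PI * (dayOfYear - 81) / 364.0;
        double eqtMinutes = 9.87 * Math.sin(2 * b) - 7.53 * Math.cos(b) - 1.5 * Math.sin(b);
        double decl = 23.45 * Math.sin(2 * Math.PI * (284 + dayOfYear) / 365.0);

        // Solar noon on the local clock.
        double dhuhr = 12 + timezone - longitude / 15.0 - eqtMinutes / 60.0;

        // Twilight angles differ per method; 18/17 degrees is the Muslim World League
        // convention (ISNA uses 15/15, for example). 0.833 degrees is sunrise/sunset.
        double fajr    = dhuhr - hourAngleOffset(18.0, latitude, decl);
        double sunrise = dhuhr - hourAngleOffset(0.833, latitude, decl);
        double maghrib = dhuhr + hourAngleOffset(0.833, latitude, decl);
        double isha    = dhuhr + hourAngleOffset(17.0, latitude, decl);

        System.out.printf("Fajr %.2f  Sunrise %.2f  Dhuhr %.2f  Maghrib %.2f  Isha %.2f%n",
                fajr, sunrise, dhuhr, maghrib, isha);
    }
}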
I think a port of ITL for Java has come out on sourceforge.net.
Closed 8 years ago.
I have two arrays containing time and voltage. I would like to convert the time domain to the frequency domain in Java, using an FFT. If there is any open-source library I could use, please point me to it. I have done some research and found a few algorithms, but they ask for a real part and an imaginary part. If anyone has an idea about that, please let me know how I could use it in my context.
Code I have found so far
Here is one library:
http://www.fftw.org/download.html
You can also use R with Java. See this link:
Java-R integration?
If you are not familiar with R, check their home page at r-project.org (I can't post more links).
While I haven't checked the implementation you link to, you should be able to use it by supplying 0s for the imaginary part. In that case you are going "forward", i.e. you set DIRECT to true, transforming from the time domain to the frequency domain. The function will return an array containing the real parts of the frequency components at even-numbered indices and the imaginary parts at odd-numbered indices.
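If you just want to see how the real/imaginary bookkeeping works before choosing a library, here is a minimal, self-contained sketch (a naive O(n^2) DFT, not the implementation linked above and not FFTW): the voltage samples are the real part, the imaginary input is implicitly zero, and output bin k corresponds to the frequency k / (n * dt) Hz, where dt is the spacing of your time array.
// Naive discrete Fourier transform of a real-valued signal (voltage samples).
// O(n^2), so only suitable for small inputs; a real FFT library does the same
// job in O(n log n).
public class SpectrumSketch {

    // Returns {re[], im[]}; the imaginary part of the input is taken to be zero.
    static double[][] dft(double[] signal) {
        int n = signal.length;
        double[] re = new double[n], im = new double[n];
        for (int k = 0; k < n; k++) {
            for (int t = 0; t < n; t++) {
                double angle = -2 * Math.PI * k * t / n;
                re[k] += signal[t] * Math.cos(angle);   // real part of bin k
                im[k] += signal[t] * Math.sin(angle);   // imaginary part of bin k
            }
        }
        return new double[][]{re, im};
    }

    public static void main(String[] args) {
        double dt = 0.001;                       // 1 ms sample spacing -> 1 kHz sample rate
        double[] voltage = new double[256];
        for (int i = 0; i < voltage.length; i++) {
            voltage[i] = Math.sin(2 * Math.PI * 62.5 * i * dt);   // 62.5 Hz test tone
        }

        double[][] spectrum = dft(voltage);
        int n = voltage.length;
        for (int k = 0; k < n / 2; k++) {        // only the first n/2 bins are unique for real input
            double magnitude = Math.hypot(spectrum[0][k], spectrum[1][k]);
            double frequencyHz = k / (n * dt);
            if (magnitude > 1.0) {               // prints 62.5 Hz, the test tone
                System.out.printf("%.1f Hz  magnitude %.1f%n", frequencyHz, magnitude);
            }
        }
    }
}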
Closed 9 years ago.
I'm an Android developer, and as part of my next app I will need to evaluate a large variety of user-created mathematical expressions and equations. I am looking for a good Java library that is lightweight and can evaluate mathematical expressions using user-defined variables and constants, trig and exponential functions, etc.
I've looked around and Jep seems to be popular, but I would like to hear more suggestions, especially from people who have used these libraries before.
JEval is a good alternative. I abandoned Jep due to it becoming commercial. The only concern is that JEval seems to be a little dormant at the moment (last release in 2008).
I wrote a simple but capable Math Expression Evaluator a while back, which is free and open source. Its main advantage is being fast and tiny, both good things on hand-held devices. If it meets your needs, you are welcome to use it.
Primary Features:
Basic math operators, with inferred precedence (^ * × · / ÷ % + -).
Explicit precedence with parenthesis.
Implicit multiplication of bracketed subexpressions.
Correct right-associativity of exponentials (power operator).
Direct support for hexadecimal numbers prefixed by 0x.
Constants and variables.
Extensible operators.
Extensible functions.
20 KiB footprint.
Example
MathEval math = new MathEval();
math.setVariable("Top",    5);
math.setVariable("Left",  20);
math.setVariable("Bottom", 15);
math.setVariable("Right", 60);
System.out.println("Middle: " + math.evaluate("floor((Right+1-Left)/2)")); // floor((60+1-20)/2) = 20
Try https://code.google.com/p/expressionoasis/. It is an extensible Expression Evaluation framework and will meet such requirements.
This doesn't exactly fit my initial conditions, but I found a wonderful parser written in C++. I'm trying to figure out Android's Native code support to see if I can use it. It's exactly what I need.
Here's the documentation for the project.
http://www.codeproject.com/KB/recipes/MathieuMathParser.aspx
There is a new commercial tool called formula4j, which may be of interest to some.
Closed 9 years ago.
I've Googled, Stack Overflowed, everything, and I cannot seem to find a tutorial I can understand. I understand the concept of genetic algorithms and how to implement them (though I haven't tried), but I cannot grasp the concept of neural networks.
I vaguely know how they work... and that's about it. Could someone direct me to a tutorial that could help someone who has not even graduated middle school yet? Sure, I'm several years ahead of the majority of people in my grade, but I don't understand summation (which I apparently need if I don't want a simple binary output), vectors, and other things that I apparently should know.
Is there a simple, bare-bones tutorial for neural networks? After I learn the basics, I'll proceed to more difficult ones. Preferably, they would be in Java.
Thanks!
Summation is just adding up a bunch of things. So,
Summation(1,2,3,4,5) = 1+2+3+4+5 = 15
(note: it's always adding: if you want to subtract, do a summation with negative numbers)
That was easy, right? ;)
A vector is an ordered tuple, which really just means it's a bunch of numbers in a specific order. It's most often seen in physics to describe position, force, velocity, etc. It's really nothing special, just some ordered numbers, where the ordering is significant:
v = <1,2,3>
If we are talking about geometry, then this vector represents a point in 3-dimensional space where the x coordinate is 1, the y coordinate is 2, and the z coordinate is 3 (See that was easy too, right)?
In neural nets, the vector is usually the vector of inputs to a neuron, so it's really just a list of numeric values. The summation of the vector would be nothing more than adding up all of the values in the vector and getting a single number as a result (which may be referred to as a "scalar" value).
(this was rushed and simplified - I'm sure someone else will help me refine it ;) )
PS. Kudos to you for diving into this stuff at the middle school level! :)
Well, there is this article in Simple English Wikipedia, but I think you already know everything it contains.
I've had the same problem for a while. I'm a high school student, so you're a little ahead of me. I had a vacation and used it to learn all I could about backpropagation, and I've found almost no resources that really help much unless you want to read so much calculus that you want to die. My advice is to first write a perceptron, which is a network with only input and output layers. This inspired me to write a post, so hopefully within half an hour of my posting here there should be a tutorial on http://certioraomnia.blogspot.com/. It may be a little late, as this question was asked three years ago, but it may help others later.
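To tie the summation/vector explanation above to the perceptron suggestion, here is a minimal Java sketch; the AND task, the learning rate, and the epoch count are just illustrative choices, not taken from any of the linked tutorials:
// A single perceptron: a weighted sum (the "summation" of an input vector)
// followed by a threshold, trained on the logical AND function.
public class PerceptronSketch {
    public static void main(String[] args) {
        double[][] inputs = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[] targets  = {0, 0, 0, 1};     // AND truth table
        double[] weights = {0.0, 0.0};
        double bias = 0.0;
        double learningRate = 0.1;

        for (int epoch = 0; epoch < 20; epoch++) {
            for (int i = 0; i < inputs.length; i++) {
                // Summation: dot product of the input vector with the weight vector.
                double sum = bias;
                for (int j = 0; j < weights.length; j++) {
                    sum += weights[j] * inputs[i][j];
                }
                double output = sum > 0 ? 1 : 0;          // step activation
                double error = targets[i] - output;
                // Perceptron learning rule: nudge the weights toward the target.
                for (int j = 0; j < weights.length; j++) {
                    weights[j] += learningRate * error * inputs[i][j];
                }
                bias += learningRate * error;
            }
        }

        for (double[] in : inputs) {
            double sum = bias + weights[0] * in[0] + weights[1] * in[1];
            System.out.printf("%.0f AND %.0f -> %d%n", in[0], in[1], sum > 0 ? 1 : 0);
        }
    }
}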
Closed 1 year ago.
I am working on a natural language parser which examines a sentence in English and extracts some information like names, dates, etc.
For example: "Lets meet next tuesday at 5 PM at the beach."
So the output will be something like: "Lets meet 15/09/2009 at 1700 hr at the beach"
So basically, what I want to know is: is there any framework or library available for Java that can do this kind of operation, like parsing dates from a sentence and giving output in a specified format?
Regards,
Pranav
Thanks for the replies. I have looked at a few NLP toolkits like LingPipe, OpenNLP, and Stanford NLP. I wanted to ask: do they have anything for date parsing in Java?
Natty is a really good replacement for JChronic.
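A minimal sketch of Natty usage, written from memory of its API, so double-check the Parser/DateGroup class and method names against the project's documentation:
import com.joestelmach.natty.DateGroup;
import com.joestelmach.natty.Parser;
import java.util.Date;
import java.util.List;

public class NattySketch {
    public static void main(String[] args) {
        Parser parser = new Parser();
        List<DateGroup> groups = parser.parse("Lets meet next tuesday at 5 PM at the beach.");
        for (DateGroup group : groups) {
            // group.getText() is the fragment Natty matched; group.getDates()
            // are the resolved java.util.Date values.
            for (Date date : group.getDates()) {
                System.out.println(group.getText() + " -> " + date);
            }
        }
    }
}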
You can use JChronic, the Java port of Chronic.
Have you tried jchronic? However, I doubt any library could work directly with sentences: you'd have to extract sentence fragments and feed them to an NLP date-parsing framework yourself, perhaps on a trial-and-error basis (larger and larger fragments until the framework throws an error).
I don't think there's any framework out there that does that out of the box. What you can do is create a set of regular expressions to match those patterns.
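As an illustration of that approach, one regex from such a rule set might look like the following; this is purely a sketch, and a real rule set would need many more patterns plus code to resolve "next tuesday" to a concrete date:
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DateRegexSketch {
    public static void main(String[] args) {
        // Matches phrases like "next tuesday at 5 PM".
        Pattern relativeDay = Pattern.compile(
                "(next|this)\\s+(monday|tuesday|wednesday|thursday|friday|saturday|sunday)"
                + "(\\s+at\\s+(\\d{1,2})(:\\d{2})?\\s*(am|pm)?)?",
                Pattern.CASE_INSENSITIVE);

        Matcher m = relativeDay.matcher("Lets meet next tuesday at 5 PM at the beach.");
        if (m.find()) {
            System.out.println("Matched: " + m.group());                    // next tuesday at 5 PM
            System.out.println("Weekday: " + m.group(2));                   // tuesday
            System.out.println("Time: " + m.group(4) + " " + m.group(6));   // 5 PM
        }
    }
}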
I would suggest using UIMA with OpenNLP connectors and some hand-made regexp rules.
I wrote an NLP script in Python's NLTK and fed the results to Ruby's Chronic.
For my use case, I had more luck with chrono-java. Sadly, it looks stale and is not available in any Maven repository (also not via https://jitpack.io/, since the build is broken), so you have to fix and build it yourself.
However, checking out the code and fixing a dependency (the maven-javadoc-plugin entry was missing a groupId, and I updated its version) allowed me to build and run a simple example successfully:
List<ParsedResult> results = Chrono.Parse("Datum Freitag, 08.04. bis einschl. Sonntag 10.04.2016");
results.forEach(result -> System.out.println(result));
resulted in 2 Dates being extracted:
ParsedResult: " 08.04" > 04/08/2018 12:00
ParsedResult: "10.04.2016" > 04/10/2016 12:00
Pretty old question, but PrettyTime::NLP is another option to try.
Closed 9 years ago.
I used tf/idf to calculate cosine similarity between two documents. It has some limitations and does not perform very well.
I looked at LDA (latent Dirichlet allocation) to calculate document similarity, but I don't know much about it and couldn't find much material about my problem either.
Can you please point me to any tutorial related to my problem, or give some advice on how I can achieve this task with LDA?
Thanks
P.S.: Is there any source code available to perform such a task with LDA?
Have you had a look at Lucene and Mahout?
This might be useful - Latent Dirichlet Allocation with Lucene and Mahout.
You might be thinking of LSA (Latent Semantic Analysis), which is a very common solution to this kind of problem.
A bit old, but for anyone still interested, take a look at this blog post (disclaimer: this is my own blog). The algorithm described there and the linked code will probably do what you need if you don't have your heart set on any specific approach.
Regarding Shashikant's comment, the cosine similarity may not be a good option because the signatures are proportional in length to the documents. Constant length signatures are preferable.
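For reference, the tf-based cosine similarity the question starts from reduces to a few lines of Java. This is a bare-bones sketch over raw term counts; a real tf/idf implementation would also weight each term by its inverse document frequency:
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Bare-bones cosine similarity between two documents, using raw term counts.
public class CosineSketch {

    static Map<String, Integer> termCounts(String document) {
        Map<String, Integer> counts = new HashMap<>();
        for (String token : document.toLowerCase().split("\\W+")) {
            if (!token.isEmpty()) {
                counts.merge(token, 1, Integer::sum);   // increment the count for this term
            }
        }
        return counts;
    }

    static double cosine(Map<String, Integer> a, Map<String, Integer> b) {
        Set<String> vocabulary = new HashSet<>(a.keySet());
        vocabulary.addAll(b.keySet());
        double dot = 0, normA = 0, normB = 0;
        for (String term : vocabulary) {
            int x = a.getOrDefault(term, 0);
            int y = b.getOrDefault(term, 0);
            dot += x * y;
            normA += x * x;
            normB += y * y;
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        String doc1 = "Document with some text";
        String doc2 = "Other document with some text";
        System.out.println(cosine(termCounts(doc1), termCounts(doc2)));   // about 0.89
    }
}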
Try this service for computing the cosine similarity between two documents:
http://www.scurtu.it/documentSimilarity.html
# Python 2 example calling the documentSimilarity API above
import urllib, urllib2
import json

API_URL = "http://www.scurtu.it/apis/documentSimilarity"

inputDict = {}
inputDict['doc1'] = 'Document with some text'
inputDict['doc2'] = 'Other document with some text'

params = urllib.urlencode(inputDict)    # form-encode the two documents
f = urllib2.urlopen(API_URL, params)    # POST them to the API
response = f.read()
responseObject = json.loads(response)   # parse the JSON response
print responseObject