Java/Groovy library to solve for a variable algebraically

If I have an equation y = 3x, I can use "algebra" to turn it into x = y/3. Is there something I can call like this:
def y = {x-> x*3}
def x = ThisWouldBeNice.solveForMe(y, 'x') //does the same as: def x = {y-> y/3}
I thought JScience could do this, but I can't figure out whether it's there.

I think the Wolfram Alpha API can do that; try something like this in the explorer http://products.wolframalpha.com/api/explorer.html
y = 3x; y=6; x?

Some time ago I did something similar (actually a quick and dirty test) with the Wolfram Alpha API, as denis.solonenko suggests. To make it work, use "f = 3*x": the API response for that input is slightly different and includes the solution for the variable x.
The code below, which includes a copy of the response (use a valid API appid to fetch it directly), solves it.
import java.net.URLEncoder
def stringEquation = "f = 3 * x"
def equation = { x -> 3*x }
//def response = ("http://api.wolframalpha.com/v2/query?appid=xxx&input=" + URLEncoder.encode(stringEquation) + "&format=plaintext").toURL().text
def response= """<?xml version='1.0' encoding='UTF-8'?>
<queryresult success='true'
error='false'
numpods='6'
datatypes='Geometry'
timedout=''
timing='0.766'
parsetiming='0.181'
parsetimedout='false'
recalculate=''
id='MSP6141a0482cfc786eibg000036bb0i3bhf6d6aih&s=50'
related='http://www4a.wolframalpha.com/api/v2/relatedQueries.jsp?id=MSP6151a0482cfc786eibg00005a4g16ee5ei232bf&s=50'
version='2.1'>
<pod title='Input'
scanner='Identity'
id='Input'
position='100'
error='false'
numsubpods='1'>
<subpod title=''>
<plaintext>f = 3 x</plaintext>
</subpod>
</pod>
<pod title='Geometric figure'
scanner='Geometry'
id='GeometricFigure (ofBoundary)'
position='200'
error='false'
numsubpods='1'>
<subpod title=''>
<plaintext>line</plaintext>
</subpod>
<states count='1'>
<state name='Properties'
input='GeometricFigure (ofBoundary)__Properties' />
</states>
</pod>
<pod title='Plot'
scanner='Plotter'
id='Plot'
position='300'
error='false'
numsubpods='1'>
<subpod title=''>
<plaintext></plaintext>
</subpod>
</pod>
<pod title='Alternate form'
scanner='Simplification'
id='AlternateForm'
position='400'
error='false'
numsubpods='1'>
<subpod title=''>
<plaintext>f-3 x = 0</plaintext>
</subpod>
</pod>
<pod title='Solution for the variable x'
scanner='Reduce'
id='SolutionForTheVariableV'
position='500'
error='false'
numsubpods='1'
primary='true'>
<subpod title=''
primary='true'>
<plaintext>x = f/3</plaintext>
</subpod>
</pod>
<pod title='Implicit derivatives'
scanner='ImplicitDifferentiation'
id='ImplicitDerivatives'
position='600'
error='false'
numsubpods='2'>
<subpod title=''>
<plaintext>(dx(f))/(df) = 1/3</plaintext>
</subpod>
<subpod title=''>
<plaintext>(df(x))/(dx) = 3</plaintext>
</subpod>
<states count='1'>
<state name='More'
input='ImplicitDerivatives__More' />
</states>
</pod>
</queryresult>"""
def queryresult = new XmlSlurper().parseText(response)
def solution = queryresult.pod.findAll { it.@title.text() == "Solution for the variable x" }.toString()
It returns: x = f/3
NOTE: see "print the closure definition/source in Groovy" if you need to work with the actual closure code rather than a string equation.
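If you prefer plain Java over Groovy's XmlSlurper, here is a minimal sketch using the JDK's built-in XPath support; it is an illustrative addition that assumes response holds the same XML string as above.
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class WolframSolutionParser {
    // Extracts the "Solution for the variable x" plaintext from a Wolfram Alpha query result.
    public static String extractSolution(String response) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(response)));
        XPath xpath = XPathFactory.newInstance().newXPath();
        // Select the plaintext of the pod whose title names the solution for x.
        return xpath.evaluate(
                "//pod[@title='Solution for the variable x']/subpod/plaintext", doc);
    }
}
Calling WolframSolutionParser.extractSolution(response) should likewise return "x = f/3".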

Related

SVG: how to get absolute coordinates after a transform with translate and rotate in Java

Currently I am using the Batik library for SVG manipulation.
Below is my element:
<g id="FrontRight_1">
<path fill="none" stroke="#000000" stroke-miterlimit="10" d="M335.775,3550.251l886.631,198.611
c0,0-17.461-467.823-190.26-756.729c-172.794-288.907-243.793-385.207-281.208-529.649
c-37.417-144.446-65.087-385.58-29.396-544.164l-431.396-161.81c0,0-100.354,225.707-99.356,607.059
c0.598,228.37,83.309,362.999,112.561,620.65C323.833,3164.645,338.045,3532.962,335.775,3550.251z"/>
Its original coordinates are 190.7826, 1756.51.
After the transform function, it is as follows:
<g id="FrontRight_1" transform="translate(-1751.828610 1227.40240) rotate(270.0)"> <path d="M335.775 3550.251 L1222.406 3748.862 L1222.406 3748.862 L1204.945 3281.039 L1032.146 2992.1329 L859.352 2703.226 L788.3529 2606.926 L750.938 2462.4841 L713.521 2318.038 L685.851 2076.904 L721.542 1918.3201 L290.146 1756.51 L290.146 1756.51 L189.7919 1982.217 L190.79 2363.569 L191.388 2591.939 L274.099 2726.568 L303.3509 2984.2192 L323.833 3164.645 L338.045 3532.962 L335.775 3550.251 Z" fill="#8498d1" stroke="#010101" stroke-width="1" /> </g>
I need to know the new absolute coordinates of the above element.
Thank you
Batik supports the SVG DOM, so the method should be the same as for the JavaScript example below.
var g = document.getElementById("FrontRight_1");
var svg = g.ownerSVGElement;
var pt = svg.createSVGPoint();
pt.x = 190.7826;
pt.y = 1756.51;
var groupTransformMatrix = g.transform.baseVal.consolidate().matrix;
pt = pt.matrixTransform(groupTransformMatrix);
console.log("pt=",pt);
<svg>
<g id="FrontRight_1" transform="translate(-1751.828610 1227.40240) rotate(270.0)">
</g>
</svg>
It should just be a matter of porting the code to Java. I'm not a Batik user, but it should be something like the following:
// 'canvas' is assumed to be the JSVGCanvas (or JSVGComponent) displaying the document.
SVGDocument document = canvas.getSVGDocument();
SVGGElement g = (SVGGElement) document.getElementById("FrontRight_1");
SVGSVGElement svg = g.getOwnerSVGElement();

// Build the point to transform (the element's original coordinate).
SVGPoint pt = svg.createSVGPoint();
pt.setX(190.7826f);
pt.setY(1756.51f);

// Consolidate the group's transform list into a single matrix and apply it to the point.
SVGMatrix groupTransformMatrix = g.getTransform().getBaseVal().consolidate().getMatrix();
pt = pt.matrixTransform(groupTransformMatrix);
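Reading the transformed point back is then just a matter of the SVGPoint accessors (a usage sketch):
// The transformed point now holds the absolute (user-space) coordinates.
float absX = pt.getX();
float absY = pt.getY();
System.out.println("absolute: " + absX + ", " + absY);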

Heatmap/Contours based on Transportation Time (Reverse Isochronic Contours)

Note: solutions in R, Python, Java, or, if necessary, C++ or C# are desired.
I am trying to draw contours based on transportation time. To be clearer, I want to cluster the points that have similar travel times (say, in 10-minute intervals) to a specific point (the destination) and map them as contours or a heatmap.
Right now, the only idea I have is to use the R package gmapsdistance to find the travel time from different origins, then cluster them and draw them on a map. But, as you can tell, this is in no way a robust solution.
This thread on the GIS community and this one for Python illustrate a similar problem, but for the destinations reachable from an origin within a specific time. I want the reverse: the origins from which I can travel to the destination within a certain time.
Right now, the code below shows my rudimentary idea (using R):
library(gmapsdistance)
set.api.key("YOUR.API.KEY")
mdestination <- "40.7+-73"
morigin1 <- "40.6+-74.2"
morigin2 <- "40+-74"
gmapsdistance(origin = morigin1,
destination = mdestination,
mode = "transit")
gmapsdistance(origin = morigin2,
destination = mdestination,
mode = "transit")
This map may also help in understanding the question:
Using this answer I can get the points I can reach from a point of origin, but I need to reverse it and find the points from which the travel time to my destination is less than or equal to a certain value;
library(httr)
library(googleway)
library(jsonlite)
appId <- "TravelTime_APP_ID"
apiKey <- "TravelTime_API_KEY"
mapKey <- "GOOGLE_MAPS_API_KEY"
location <- c(40, -73)
CommuteTime <- (5 / 6) * 60 * 60
url <- "http://api.traveltimeapp.com/v4/time-map"
requestBody <- paste0('{
"departure_searches" : [
{"id" : "test",
"coords": {"lat":', location[1], ', "lng":', location[2],' },
"transportation" : {"type" : "driving"} ,
"travel_time" : ', CommuteTime, ',
"departure_time" : "2017-05-03T07:20:00z"
}
]
}')
res <- httr::POST(url = url,
httr::add_headers('Content-Type' = 'application/json'),
httr::add_headers('Accept' = 'application/json'),
httr::add_headers('X-Application-Id' = appId),
httr::add_headers('X-Api-Key' = apiKey),
body = requestBody,
encode = "json")
res <- jsonlite::fromJSON(as.character(res))
pl <- lapply(res$results$shapes[[1]]$shell, function(x){
googleway::encode_pl(lat = x[['lat']], lon = x[['lng']])
})
df <- data.frame(polyline = unlist(pl))
df_marker <- data.frame(lat = location[1], lon = location[2])
google_map(key = mapKey) %>%
add_markers(data = df_marker) %>%
add_polylines(data = df, polyline = "polyline")
Moreover, the documentation of the Travel Time Map Platform describes Multi Origins with Arrival Time, which is exactly what I want to do. But I need to do this for both public transportation and driving (for places with less than an hour's commute time), and since public transport is tricky (it depends on which station you are close to), maybe a heatmap is a better option than contours.
This answer is based on obtaining an origin-destination matrix between a grid of (roughly) equally distant points. This is a computationally intensive operation, not only because it requires a good number of API calls to mapping services, but also because the servers must calculate a matrix for each call. The number of required calls grows quadratically with the number of points in the grid.
To tackle this problem, I would suggest that you consider running a mapping server on your local machine or on a local server. Project OSRM offers a relatively simple, free, and open-source solution that lets you run an OpenStreetMap routing server in a Linux Docker container (https://github.com/Project-OSRM/osrm-backend). Having your own local mapping server allows you to make as many API calls as you desire. R's osrm package lets you interact with OSRM's APIs, including those served by a local instance.
library(raster) # Optional
library(sp)
library(ggmap)
library(tidyverse)
library(osrm)
devtools::install_github("cmartin/ggConvexHull") # Needed to quickly draw the contours
library(ggConvexHull)
I create a grid of 96 roughly equally distant points around the Bruxelles (Belgium) conurbation.
This grid does not take the earth's curvature into account, which is negligible at the scale of city distances.
For convenience, I use the raster package to download a shapefile of Belgium and extract the nodes for the Brussels region.
BE <- raster::getData("GADM", country = "BEL", level = 1)
Bruxelles <- BE[BE$NAME_1 == "Bruxelles", ]
df_grid <- makegrid(Bruxelles, cellsize = 0.02) %>%
SpatialPoints() %>%
## I convert the SpatialPoints object into a simple data.frame
as.data.frame() %>%
## create a unique id for each point in the data.frame
rownames_to_column() %>%
## rename variables of the data.frame with more explanatory names.
rename(id = rowname, lat = x2, lon = x1)
## I point osrm.server to the OpenStreet docker running in my Linux machine. ...
### ... Do not run this if you are getting your data from OpenStreet public servers.
options(osrm.server = "http://127.0.0.1:5000/")
## I obtain a list with distances (Origin Destination Matrix in ...
### ... minutes, origins and destinations)
Distance_Tables <- osrmTable(loc = df_grid)
OD_Matrix <- Distance_Tables$durations %>% ## subset the previous list
## convert the Origin Destination Matrix into a tibble
as_data_frame() %>%
rownames_to_column() %>%
## make sure we have an id column for the OD tibble
rename(origin_id = rowname) %>%
## transform the tibble into long/tidy format
gather(key = destination_id, value = distance_time, -origin_id) %>%
left_join(df_grid, by = c("origin_id" = "id")) %>%
## set origin coordinates
rename(origin_lon = lon, origin_lat = lat) %>%
left_join(df_grid, by = c("destination_id" = "id")) %>%
## set destination coordinates
rename(destination_lat = lat, destination_lon = lon)
## Obtain a nice looking road map of Brussels
Brux_map <- get_map(location = "bruxelles, belgique",
zoom = 11,
source = "google",
maptype = "roadmap")
ggmap(Brux_map) +
geom_point(aes(x = origin_lon, y = origin_lat),
data = OD_Matrix %>%
## Here I selected point_id 42 as the desired target, ...
## ... just because it is not far from the City Center.
filter(destination_id == 42),
size = 0.5) +
## Draw a diamond around point_id 42
geom_point(aes(x = origin_lon, y = origin_lat),
data = OD_Matrix %>%
filter(destination_id == 42, origin_id == 42),
shape = 5, size = 3) +
## Contour marking a distance of up to 8 minutes
geom_convexhull(alpha = 0.2,
fill = "blue",
colour = "blue",
data = OD_Matrix %>%
filter(destination_id == 42,
distance_time <= 8),
aes(x = origin_lon, y = origin_lat)) +
## Contour marking a distance of up to 15 minutes
geom_convexhull(alpha = 0.2,
fill = "red",
colour = "red",
data = OD_Matrix %>%
filter(destination_id == 42,
distance_time <= 15),
aes(x = origin_lon, y = origin_lat))
Results
The blue contour represents distances to the city center of up to 8 minutes.
The red contour represents distances of up to 15 minutes.
I came up with an approach that is workable compared to making numerous API calls.
The idea is to find the places you can reach within a certain time (look at this thread). Traffic can be simulated by changing the departure time from morning to evening. You will end up with an overlapping area that you can reach from both places.
Then you can use Nicolás's answer to map some points within that overlapping area and draw the heatmap for the destinations you have. This way, you will have less area (fewer points) to cover, and therefore you will make far fewer API calls (remember to use an appropriate departure time for that matter).
Below, I try to demonstrate what I mean and get you to the point where you can build the grid mentioned in the other answer to make your estimation more robust.
This shows how to map the intersected area.
library(httr)
library(googleway)
library(jsonlite)
appId <- "Travel.Time.ID"
apiKey <- "Travel.Time.API"
mapKey <- "Google.Map.ID"
locationK <- c(40, -73) #K
locationM <- c(40, -74) #M
CommuteTimeK <- (3 / 4) * 60 * 60
CommuteTimeM <- (0.55) * 60 * 60
url <- "http://api.traveltimeapp.com/v4/time-map"
requestBodyK <- paste0('{
"departure_searches" : [
{"id" : "test",
"coords": {"lat":', locationK[1], ', "lng":', locationK[2],' },
"transportation" : {"type" : "public_transport"} ,
"travel_time" : ', CommuteTimeK, ',
"departure_time" : "2018-06-27T13:00:00z"
}
]
}')
requestBodyM <- paste0('{
"departure_searches" : [
{"id" : "test",
"coords": {"lat":', locationM[1], ', "lng":', locationM[2],' },
"transportation" : {"type" : "driving"} ,
"travel_time" : ', CommuteTimeM, ',
"departure_time" : "2018-06-27T13:00:00z"
}
]
}')
resKi <- httr::POST(url = url,
httr::add_headers('Content-Type' = 'application/json'),
httr::add_headers('Accept' = 'application/json'),
httr::add_headers('X-Application-Id' = appId),
httr::add_headers('X-Api-Key' = apiKey),
body = requestBodyK,
encode = "json")
resMi <- httr::POST(url = url,
httr::add_headers('Content-Type' = 'application/json'),
httr::add_headers('Accept' = 'application/json'),
httr::add_headers('X-Application-Id' = appId),
httr::add_headers('X-Api-Key' = apiKey),
body = requestBodyM,
encode = "json")
resK <- jsonlite::fromJSON(as.character(resKi))
resM <- jsonlite::fromJSON(as.character(resMi))
plK <- lapply(resK$results$shapes[[1]]$shell, function(x){
googleway::encode_pl(lat = x[['lat']], lon = x[['lng']])
})
plM <- lapply(resM$results$shapes[[1]]$shell, function(x){
googleway::encode_pl(lat = x[['lat']], lon = x[['lng']])
})
dfK <- data.frame(polyline = unlist(plK))
dfM <- data.frame(polyline = unlist(plM))
df_markerK <- data.frame(lat = locationK[1], lon = locationK[2], colour = "#green")
df_markerM <- data.frame(lat = locationM[1], lon = locationM[2], colour = "#lavender")
iconK <- "red"
df_markerK$icon <- iconK
iconM <- "blue"
df_markerM$icon <- iconM
google_map(key = mapKey) %>%
add_markers(data = df_markerK,
lat = "lat", lon = "lon",colour = "icon",
mouse_over = "K_K") %>%
add_markers(data = df_markerM,
lat = "lat", lon = "lon", colour = "icon",
mouse_over = "M_M") %>%
add_polygons(data = dfM, polyline = "polyline", stroke_colour = '#461B7E',
fill_colour = '#461B7E', fill_opacity = 0.6) %>%
add_polygons(data = dfK, polyline = "polyline",
stroke_colour = '#F70D1A',
fill_colour = '#FF2400', fill_opacity = 0.4)
You can extract the intersected area like this:
# install.packages(c("rgdal", "sp", "raster","rgeos","maptools"))
library(rgdal)
library(sp)
library(raster)
library(rgeos)
library(maptools)
Kdata <- resK$results$shapes[[1]]$shell
Mdata <- resM$results$shapes[[1]]$shell
xyfunc <- function(mydf) {
xy <- mydf[,c(2,1)]
return(xy)
}
spdf <- function(xy, mydf){
sp::SpatialPointsDataFrame(
coords = xy, data = mydf,
proj4string = CRS("+proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0"))}
for (i in (1:length(Kdata))) {Kdata[[i]] <- xyfunc(Kdata[[i]])}
for (i in (1:length(Mdata))) {Mdata[[i]] <- xyfunc(Mdata[[i]])}
Kshp <- list(); for (i in (1:length(Kdata))) {Kshp[i] <- spdf(Kdata[[i]],Kdata[[i]])}
Mshp <- list(); for (i in (1:length(Mdata))) {Mshp[i] <- spdf(Mdata[[i]],Mdata[[i]])}
Kbind <- do.call(bind, Kshp)
Mbind <- do.call(bind, Mshp)
#plot(Kbind);plot(Mbind)
x <- intersect(Kbind,Mbind)
#plot(x)
xdf <- data.frame(x)
xdf$icon <- "https://i.stack.imgur.com/z7NnE.png"
google_map(key = mapKey,
location = c(mean(latmax,latmin), mean(lngmax,lngmin)), zoom = 8) %>%
add_markers(data = xdf, lat = "lat", lon = "lng", marker_icon = "icon")
This is just an illustration of the intersected area.
Now you can get the coordinates from the xdf data frame and construct your grid around those points to finally come up with a heatmap. Out of respect for the other user who came up with that idea/answer, I am not including it here and am just referencing it:
Nicolás Velásquez - Obtaining an Origin-Destination Matrix between a Grid of (Roughly) Equally Distant Points

TensorFlow in Android: How do I use my linear regression model to predict a value in an Android application?

I currently have an .ipynb file (IPython notebook) that contains linear regression code / a model (I'm not entirely sure it's a model) that I created earlier.
How do I implement this model in an Android application so that if I input a value of 'x' in a text box, it outputs the predicted value of 'y' in a TextView? The function is Y = mx + b.
I've tried looking at different tutorials, but they were mostly not "step-by-step" guides, which made them really hard to understand; I'm a beginner at coding.
Here's my code for the model:
import tensorflow as tf
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
rng = np.random
from numpy import genfromtxt
from sklearn.datasets import load_boston
# Parameters
learning_rate = 0.0001
training_epochs = 1000
display_step = 50
n_samples = 222
X = tf.placeholder("float") # create symbolic variables
Y = tf.placeholder("float")
filename_queue = tf.train.string_input_producer(["C://Users//Shiina//battdata.csv"],shuffle=False)
reader = tf.TextLineReader() # skip_header_lines=1 if csv has headers
key, value = reader.read(filename_queue)
# Default values, in case of empty columns. Also specifies the type of the
# decoded result.
record_defaults = [[1.], [1.]]
height, soc = tf.decode_csv(
    value, record_defaults=record_defaults)
features = tf.stack([height])
# Set model weights
W = tf.Variable(rng.randn(), name="weight")
b = tf.Variable(rng.randn(), name="bias")
# Construct a linear model
pred_soc = tf.add(tf.multiply(height, W), b) # XW + b <- y = mx + b where W is gradient, b is intercept
# Mean squared error
cost = tf.reduce_sum(tf.pow(pred_soc-soc, 2))/(2*n_samples)
# Gradient descent
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
# Initializing the variables
init = tf.global_variables_initializer()
with tf.Session() as sess:
    # Start populating the filename queue.
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(coord=coord)
    sess.run(init)

    # Fit all training data
    for epoch in range(training_epochs):
        _, cost_value = sess.run([optimizer, cost])

        # Display logs per epoch step
        if (epoch + 1) % display_step == 0:
            c = sess.run(cost)
            print("Epoch:", '%04d' % (epoch + 1), "cost=", "{:.9f}".format(c),
                  "W=", sess.run(W), "b=", sess.run(b))

    print("Optimization Finished!")
    training_cost = sess.run(cost)
    print("Training cost=", training_cost, "W=", sess.run(W), "b=", sess.run(b), '\n')

    # Plot data after completing training
    train_X = []
    train_Y = []
    for i in range(n_samples):  # Your input data size, to loop through once
        X, Y = sess.run([height, pred_soc])  # Call pred to get the prediction with the updated weights
        train_X.append(X)
        train_Y.append(Y)

    # Graphic display
    plt.plot(train_X, train_Y, 'ro', label='Original data')
    plt.ylabel("SoC")
    plt.xlabel("Height")
    plt.axis([0, 1, 0, 100])
    plt.plot(train_X, train_Y, linewidth=2.0)
    plt.legend()
    plt.show()

    coord.request_stop()
    coord.join(threads)
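Since the trained model is just Y = Wx + b, one minimal approach on the Android side is to export the learned W and b from the notebook (for example by printing sess.run(W) and sess.run(b)) and do the arithmetic directly, rather than running TensorFlow on the device. The sketch below assumes exactly that; the layout ids and the constant values are hypothetical placeholders.
import android.os.Bundle;
import android.widget.Button;
import android.widget.EditText;
import android.widget.TextView;
import androidx.appcompat.app.AppCompatActivity;

public class PredictActivity extends AppCompatActivity {

    // Hypothetical values: replace with the W and b printed by the training script.
    private static final float W = 3.0f;
    private static final float B = 0.5f;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_predict); // assumed layout with edit_x, text_y, button_predict

        EditText editX = findViewById(R.id.edit_x);
        TextView textY = findViewById(R.id.text_y);
        Button predict = findViewById(R.id.button_predict);

        predict.setOnClickListener(v -> {
            float x = Float.parseFloat(editX.getText().toString());
            float y = W * x + B; // y = mx + b with the learned parameters
            textY.setText(String.valueOf(y));
        });
    }
}
If you would rather run the TensorFlow graph itself on the device, the usual route is to export the trained graph (for example as a frozen graph or a TensorFlow Lite model) and load it with TensorFlow's Android inference/Lite APIs, but for a single linear equation the two constants above are all you need.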

SoapUI: response with CDATA is giving null; searched various articles but no luck

Hello, I have the SOAP response below.
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<soap:Body>
<GetWeatherResponse xmlns="http://www.webserviceX.NET">
<GetWeatherResult><![CDATA[<?xml version="1.0" encoding="utf-16"?>
<CurrentWeather>
<Location>Cape Town, Cape Town International Airport, South Africa (FACT) 33-59S 018-36E 0M</Location>
<Time>Jun 04, 2016 - 05:00 AM EDT / 2016.06.04 0900 UTC</Time>
<Wind> from the SE (130 degrees) at 21 MPH (18 KT):0</Wind>
<Visibility> greater than 7 mile(s):0</Visibility>
<SkyConditions> mostly clear</SkyConditions>
<Temperature> 60 F (16 C)</Temperature>
<DewPoint> 44 F (7 C)</DewPoint>
<RelativeHumidity> 55%</RelativeHumidity>
<Pressure> 30.39 in. Hg (1029 hPa)</Pressure>
<Status>Success</Status>
</CurrentWeather>]]></GetWeatherResult>
</GetWeatherResponse>
</soap:Body>
</soap:Envelope>
for the following request: http://www.webservicex.net/globalweather.asmx.
I want to read the XML data in the above response using a script, but it is giving null.
I have tried the script below:
def groovyUtils = new com.eviware.soapui.support.GroovyUtils(context)
def holder = groovyUtils.getXmlHolder(messageExchange.responseContent)
holder.namespaces["ns"] = "http://www.webserviceX.NET/"
def weatherinfo = holder.getNodeValue("//ns:GetWeatherResult/text()")
log.info weatherinfo
But instead of getting the above response I am getting NULL. I have read the SoapUI documentation on CDATA but it's not working.
I got the answer: it was just one extra slash in the namespace that was giving me null.
Both of the scripts below are working now.
def respXmlHolder = new com.eviware.soapui.support.XmlHolder(messageExchange.getResponseContentAsXml())
respXmlHolder.namespaces["ns1"] = "http://www.webserviceX.NET"
def CDATAXml = respXmlHolder.getNodeValue("//ns1:GetWeatherResult/text()")
log.info CDATAXml
def groovyUtils = new com.eviware.soapui.support.GroovyUtils(context)
def holder = groovyUtils.getXmlHolder(messageExchange.responseContent)
holder.namespaces["ns"] = "http://www.webserviceX.NET"
def weatherinfo= holder.getNodeValue("//ns:GetWeatherResult/text()")
log.info weatherinfo

Castor Mapping Arrays

I've got a class with this constructor:
public ContEmpirical(double xs[], double fs[]) {
    Check.check(xs.length == fs.length + 1 && fs.length > 0,
                "Empirical distribution array mismatch");
    this.xs = new double[xs.length];
    this.xs = xs;
    this.cs = new double[xs.length];

    double fTotal = 0.0;
    for (int i = 0; i < fs.length; i++)
        fTotal += fs[i];

    cs[0] = 0;
    for (int i = 0; i < fs.length; i++)
        cs[i + 1] = cs[i] + fs[i] / fTotal;
}
Attributes:
private double xs[], cs[];
private double fs[]; // I added this attribute to make Castor's life easier, since it always wants to map a constructor arg to a class attribute.
The mapping file I have is:
<class name="tools.ContEmpirical">
<description xmlns="">
Mapping for class tools.ContEmpirical
</description>
<map-to xml="ContEmpirical"/>
<field name="xs" type="double" collection="array" set-method="%1" get-method="getXs" required="true">
<bind-xml node="attribute"/>
</field>
<field name="fs" type="double" collection="array" set-method="%2" get-method="getFs" required="true">
<bind-xml node="attribute"/>
</field></class>
Yet when I try to marshal an instance of ContEmpirical I get this XML:
<ContEmpirical xs="0.2 0.3 0.4 0.8"/>
When really I should be getting something like this:
<ContEmpirical xs="0.2 0.3 0.4 0.8" fs="0.2 0.3 0.4 0.8"/>
Is there something I'm missing from the mapping?
Can you post the ContEmpirical class? Are you initialising the fs array to something at the field level?
UPDATE: I must be missing something. You said you "expect" fs in the output XML, but does the class actually have an attribute named fs? From the constructor code you have posted, fs is never assigned internally (there is a parameter called fs, but it is only used for computing cs).
So really, your object has only xs and cs variables, and in your XML, if you declare a mapping for cs, you should get cs.
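One way to get fs into the output (a sketch under that diagnosis, not tested against Castor) is to actually keep the constructor argument in the fs field and expose the bean-style getters the mapping refers to:
public class ContEmpirical {

    private double[] xs, cs;
    private double[] fs; // stored so Castor has something to marshal

    public ContEmpirical(double[] xs, double[] fs) {
        Check.check(xs.length == fs.length + 1 && fs.length > 0,
                    "Empirical distribution array mismatch");
        this.xs = xs.clone();
        this.fs = fs.clone(); // keep the frequencies instead of discarding them

        this.cs = new double[xs.length];
        double fTotal = 0.0;
        for (double f : fs) fTotal += f;
        cs[0] = 0;
        for (int i = 0; i < fs.length; i++)
            cs[i + 1] = cs[i] + fs[i] / fTotal;
    }

    public double[] getXs() { return xs; }
    public double[] getFs() { return fs; }
}
With fs populated and getFs() available, the existing field mapping for fs has something to read, so the marshalled element should gain the fs attribute alongside xs.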
