Integrating sampled data in R


I have measurement data sampled over time and want to integrate it. The test dataset contains ~100,000 samples (~100 s at 1000 Hz).

My first approach (the table contains a timestamp (0..100 s) and the value of each data point, both doubles):

# test dataset available (gzipped, 720k) here: http://tux4u.de/so.rtab.gz
table <- read.table("/tmp/so.rtab", header=TRUE)
time <- table$t
data <- table$val
start <- min(time)
stop <- max(time)
sampling_rate <- 1000
divs <- (max(time) - min(time)) * sampling_rate
# linear interpolator over the samples, returning 0 outside the sampled range
data_fun <- approxfun(time, data, method="linear", 0, 0)
result <- integrate(data_fun, start, stop, subdivisions=divs)

But somehow the integration runs forever (like an endless loop; it eats one CPU core completely). I looked at the values:

> start
[1] 0
> stop
[1] 98.99908
> divs
[1] 98999.08

The strange thing is that when I evaluate

> integrate(data_fun, 0, 98, subdivisions=100000)$value + integrate(data_fun, 98, 99)$value
[1] 2.640055

it works (computation time < 3 s), but the following evaluation (which should yield the same result)

> integrate(data_fun, 0, 99, subdivisions=100000)$value 

never terminates either. And this one (which is in fact a subintegral of the working one above) does not terminate:

> integrate(data_fun, 0, 89, subdivisions=100000)$value 

It seems a bit random to me when it works and when it doesn't. Am I doing something wrong, or can I improve the process somehow?

Thanks!

(Hint: the sampling points are not distributed equally.)
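One quick way to confirm that hint on the dataset itself (a diagnostic sketch, assuming the t column from the code above):

# inspect the spacing between consecutive timestamps; a nonzero
# spread confirms the samples are not equally spaced
d <- diff(table$t)
summary(d)
range(d)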

Ahem, you know you may just sum this up? cumsum is fast:

cumsum(table$val)*diff(table$t)[1] 

For unequal time differences, you may use:

cumsum(table$val[-nrow(table)]*diff(table$t)) 
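For completeness, a hedged usage sketch (using the same table as in the question): that expression is a left-endpoint rectangle rule, and the last element of the running sum approximates the total integral, which can be compared against the integrate() result.

# left-endpoint rectangle rule for irregular spacing; the final
# element of the running sum approximates the total integral
running <- cumsum(table$val[-nrow(table)] * diff(table$t))
total <- tail(running, 1)
total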

There is no need for more complex numerics since the data in this case is densely sampled; nevertheless, there are better methods than going through an interpolator.
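One such better method, sketched here as my own suggestion rather than part of the answer above, is the trapezoidal rule, which handles the unequal spacing directly (again assuming the t/val columns from the question):

# trapezoidal rule: average adjacent samples, then weight by the
# (possibly unequal) time steps between them
t <- table$t
v <- table$val
trapz <- sum(diff(t) * (head(v, -1) + tail(v, -1)) / 2)
trapz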

