We are slowly phasing out R and Matlab as analysis tools, even in research, and replacing them with strongly typed languages (mainly C# and some C++). The reason is that integration into production is quicker, and they also seem more suitable for crunching through larger amounts of data.
Most of the numerical algorithms you need can be found in open-source libraries. If a specific function is not available through these, you can interface to R via the statconnector, or go to the trouble of either marshalling or translating the C source (which doesn't take very long anyway).
Personally, I find the arrival of LINQ a real blessing for preparing and aligning data. Sure, it can be done in R or Matlab as well, but I find LINQ much more convenient to use. We are dealing with quite massive datasets, so relational databases don't cut it in terms of read/query speeds. The only alternative is (de)serializing objects to and from binary files, which copes with the size and achieves better processing performance. Other than specialized, commercial databases, of course, which are not economically feasible for us at this stage.
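To sketch the alignment point: joining two daily series on their timestamps is a few lines with LINQ. This is a minimal illustration with made-up types and data, not code from any particular framework:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class AlignDemo
{
    // Illustrative observation type; hypothetical, not from any library.
    public record Tick(DateTime Date, double Value);

    // Inner join on timestamp: keeps only dates present in both series.
    public static List<(DateTime Date, double X, double Y)> Align(
        IEnumerable<Tick> a, IEnumerable<Tick> b) =>
        a.Join(b,
               s => s.Date,
               t => t.Date,
               (s, t) => (s.Date, X: s.Value, Y: t.Value))
         .OrderBy(p => p.Date)
         .ToList();

    public static void Main()
    {
        var prices = new List<Tick>
        {
            new(new DateTime(2011, 1, 3), 100.0),
            new(new DateTime(2011, 1, 4), 101.5),
            new(new DateTime(2011, 1, 5),  99.8),
        };
        // Second series covers a different (overlapping) set of dates.
        var rates = new List<Tick>
        {
            new(new DateTime(2011, 1, 4), 50.2),
            new(new DateTime(2011, 1, 5), 51.0),
            new(new DateTime(2011, 1, 6), 50.7),
        };

        // Only the two common dates (Jan 4 and Jan 5) survive the join.
        foreach (var row in Align(prices, rates))
            Console.WriteLine($"{row.Date:yyyy-MM-dd}  {row.X}  {row.Y}");
    }
}
```

The equivalent in R would be a `merge` on the date column; the LINQ version reads the same way whether the source is an in-memory list or objects streamed off a binary file.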
The only overhead in terms of time and additional code from using C# is that you have to (a) format the output and (b) take care of things like charting yourself. You do the latter once, in an easily extensible framework, and the additional overhead shrinks to almost nothing. Also, the gain from Visual Studio's debugger over R's outweighs any overhead incurred otherwise.