For the business, all this means efficiency, lower cost, better performance, and centralized/standardized systems, applications, coding, etc. . . . in the end, my focus is the entire "business process" rather than the components of the IT system.
It is quite surprising to find that so many businesses do Quant computations in Excel on desktop machines . . .
A database server is not simply a filing cabinet . . . within most organizations, database servers are among the most powerful computers available. Now that SQL Server has access to the highly efficient .NET CLR, desktop applications (e.g. Excel) need only handle presentation and user "click" actions, which are well within the processing power of a desktop machine. "Data set"-level computations, on the other hand, are generally best performed on the database server . . . that's one of a DB server's primary functions!

Over and above that, there are substantial efficiencies in avoiding back-and-forth "mass" transfers of data from the DB server across the network to an application on a desktop machine. It makes no sense to have a desktop machine perform huge computations when the much more powerful DB server can do them faster and with greater accuracy. Why not have the DB server deliver one "average number" to the desktop application rather than the 10,000 individual numbers used to compute that average?

While nothing matches the native DB language (T-SQL) for calling up, filtering, and sorting data, it is not as efficient as the .NET CLR at computations. Hence, giving the DB server access to the much more efficient .NET CLR relieves and restores processing power! After all, the DB server already has the data called up and in hand . . . any additional overhead related to the computations is recovered several times over by the efficiency gains already noted.
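To make the "one average number vs. 10,000 individual numbers" point concrete, here is a minimal sketch. It uses SQLite via Python purely as a stand-in for a real DB server (the post is about SQL Server, but the principle is identical): the first approach ships every row to the client and averages there, the second lets the database aggregate and return a single value.

```python
import sqlite3
import statistics

# In-memory database standing in for the DB server (illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (value REAL)")
conn.executemany("INSERT INTO prices VALUES (?)",
                 [(float(i),) for i in range(10000)])

# Desktop-side approach: pull all 10,000 rows across the "network",
# then compute the average in the client application.
rows = [r[0] for r in conn.execute("SELECT value FROM prices")]
client_avg = statistics.fmean(rows)

# Server-side approach: the database aggregates and ships ONE number.
(server_avg,) = conn.execute("SELECT AVG(value) FROM prices").fetchone()

print(len(rows), client_avg, server_avg)
```

Both approaches yield the same average, but the second transfers a single row instead of 10,000 — the network and client-side savings grow with the size of the data set.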
I've already fully integrated QLNet into a database and it works great.
Thanks but . . . the original question was "what", not "how". I suppose the question could be refined to "what is hot" right now.