
Wish list of financial functions for a database?

I'm starting a financial database project and am looking for suggestions regarding algorithms/functions that could make sense for execution in the database. The objectives are to have a set of functions universally available through the database and to improve timeliness by returning (partially) computed data sets. I'd really appreciate practical ideas based on real-world experience . . . the tried and true. I'm doing this in C# as SQL CLR objects using VS2010 and MSSQL 2008.
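To make the setup concrete, here's the shape of a SQL CLR scalar function as I'm building them . . . a minimal sketch with illustrative names, with the one-time T-SQL registration included as comments:

using System;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    // Continuously compounded return between two prices.
    // NULL-safe: SQL NULLs in, SQL NULL out.
    [SqlFunction(IsDeterministic = true, IsPrecise = false)]
    public static SqlDouble LogReturn(SqlDouble previousPrice, SqlDouble currentPrice)
    {
        if (previousPrice.IsNull || currentPrice.IsNull ||
            previousPrice.Value <= 0.0 || currentPrice.Value <= 0.0)
            return SqlDouble.Null;
        return new SqlDouble(Math.Log(currentPrice.Value / previousPrice.Value));
    }
}

// One-time T-SQL registration (paths/names illustrative):
//   CREATE ASSEMBLY FinanceClr FROM 'C:\build\FinanceClr.dll'
//       WITH PERMISSION_SET = SAFE;
//   CREATE FUNCTION dbo.LogReturn(@prev float, @curr float) RETURNS float
//       AS EXTERNAL NAME FinanceClr.UserDefinedFunctions.LogReturn;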
 
You probably mean stored procedures (SPs) and functions that you can call from your C# code?
I've done a few in the past, though there are much more advanced methods available now in .NET 3.5+. LINQ is a newer feature you may want to take advantage of.
Some of the most useful code snippets I wrote in MS SQL were a few functions that check for name changes, firm mergers, etc. When you deal with several names in your portfolio, a single name change may screw the whole thing up if you don't pay attention to the little details (a toy sketch of the idea is below).
This pretty much depends on what you plan to do and what products and datasets you use.
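For instance, a minimal LINQ sketch of the name-change lookup, with made-up names and an in-memory dictionary standing in for the real mapping table:

using System;
using System.Collections.Generic;
using System.Linq;

class NameChangeDemo
{
    // Made-up rename history: old name -> name it changed to.
    // A real version would load this from a corporate-actions table.
    static readonly Dictionary<string, string> Renames =
        new Dictionary<string, string>
        {
            { "OldCo", "MidCo" },   // OldCo renamed itself MidCo...
            { "MidCo", "NewCo" }    // ...which later merged into NewCo
        };

    // Follow the rename chain until no further change is recorded.
    // (Assumes the mapping has no cycles.)
    static string CurrentName(string name)
    {
        string next;
        while (Renames.TryGetValue(name, out next))
            name = next;
        return name;
    }

    static void Main()
    {
        var portfolio = new[] { "OldCo", "SteadyCorp" };
        foreach (string resolved in portfolio.Select(CurrentName))
            Console.WriteLine(resolved);   // prints NewCo, then SteadyCorp
    }
}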
 
Putting business logic in your database layer is a short path to disaster.
 
ERP

Disagree. Most ERP systems build business logic into a business-driven workbench that sits on a database. It is difficult, and therefore it is expensive.

It is not easy, but it can be done, depending on how well you really know the business processes.
 
The solution depends on the task you are trying to accomplish; without those details there is no ultimate magic bullet. Unless it is required, you might not want to put all your logic at the database level, whether through SQL CLR or through Transact-SQL and SQL objects. However, it is a good idea to return partially formatted datasets and oscillators using built-in types, functions, and T-SQL like the good old SELECT statement; customize it with a CTE, and do not create a GB-sized temp table just to calculate an average. Use scalar functions (see Functions (Transact-SQL)); they are very powerful and will eliminate the need to return gigabytes of data to the application level just to compute some average; for that there are Aggregate Functions. (A sketch of the partially-computed-dataset idea follows below.)
Proper database design should also be considered if there is a requirement for high-performance computing. You will need to deal with optimization, indexing, and proper hardware design as well; a certain RAID level will be needed, and possibly even a separate storage location for certain tables.
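For example, here is a sketch of a CLR table-valued function that streams a derived series (daily log returns) instead of shipping raw prices to the client . . . the dbo.Prices table, its columns, and the function names are purely illustrative:

using System;
using System.Collections;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    // Returns (TradeDate, LogReturn) rows computed in-process, so only
    // the derived series crosses the wire, not the raw price history.
    [SqlFunction(DataAccess = DataAccessKind.Read,
                 FillRowMethodName = "FillReturnRow",
                 TableDefinition = "TradeDate datetime, LogReturn float")]
    public static IEnumerable DailyLogReturns(SqlString ticker)
    {
        var rows = new ArrayList();
        using (var conn = new SqlConnection("context connection=true"))
        {
            conn.Open();
            var cmd = new SqlCommand(
                "SELECT TradeDate, ClosePrice FROM dbo.Prices " +
                "WHERE Ticker = @t ORDER BY TradeDate", conn);
            cmd.Parameters.AddWithValue("@t", ticker.Value);
            double? prev = null;
            using (var rdr = cmd.ExecuteReader())
            {
                while (rdr.Read())
                {
                    DateTime d = rdr.GetDateTime(0);
                    double px = rdr.GetDouble(1);
                    if (prev.HasValue)
                        rows.Add(new object[] { d, Math.Log(px / prev.Value) });
                    prev = px;
                }
            }
        }
        return rows;   // could also stream with yield return for big sets
    }

    public static void FillReturnRow(object row, out SqlDateTime d, out SqlDouble r)
    {
        var a = (object[])row;
        d = new SqlDateTime((DateTime)a[0]);
        r = new SqlDouble((double)a[1]);
    }
}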
 
I agree with the cons folks above; I'd add:
- PL/SQL isn't the most powerful/fun language out there (and PL/SQL is probably the best of the breed, tying you to Oracle ad vitam aeternam)
- your data will have to live in the database; your code can't get input from files/streams/you name it . . . seems limiting
 
Thanks for all the good input. Actually, I'm asking for help with "features" shall we say . . . stuff users want that's useful to them. For example, I've already done some basic things like options pricing/greeks, distribution curves, etc. As for the architecture/coding, I have that pretty well under control (I think). I've been around for a while and am getting bored with routine DB admin/dev. I like math and this looks like a good way to combine experience with something fun.
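For reference, here's the kind of thing I mean by basic options pricing/greeks . . . a bare-bones standalone sketch of the Black-Scholes call price and delta (not my production code), using the standard polynomial approximation to the normal CDF:

using System;

public static class BlackScholes
{
    // Abramowitz & Stegun 26.2.17 polynomial approximation to the
    // standard normal CDF (absolute error ~ 7.5e-8).
    static double NormCdf(double x)
    {
        double k = 1.0 / (1.0 + 0.2316419 * Math.Abs(x));
        double poly = k * (0.319381530 + k * (-0.356563782 +
                      k * (1.781477937 + k * (-1.821255978 + k * 1.330274429))));
        double cdf = 1.0 - Math.Exp(-0.5 * x * x) / Math.Sqrt(2.0 * Math.PI) * poly;
        return x >= 0.0 ? cdf : 1.0 - cdf;
    }

    // European call: s = spot, k = strike, r = risk-free rate,
    // sigma = volatility, t = years to expiry.
    public static double CallPrice(double s, double k, double r, double sigma, double t)
    {
        double d1 = (Math.Log(s / k) + (r + 0.5 * sigma * sigma) * t)
                    / (sigma * Math.Sqrt(t));
        double d2 = d1 - sigma * Math.Sqrt(t);
        return s * NormCdf(d1) - k * Math.Exp(-r * t) * NormCdf(d2);
    }

    // Delta of the call is simply N(d1).
    public static double CallDelta(double s, double k, double r, double sigma, double t)
    {
        double d1 = (Math.Log(s / k) + (r + 0.5 * sigma * sigma) * t)
                    / (sigma * Math.Sqrt(t));
        return NormCdf(d1);
    }

    static void Main()
    {
        // ATM call, 5% rate, 20% vol, 6 months: price ~ 6.89, delta ~ 0.60
        Console.WriteLine("{0:F4}  {1:F4}",
            CallPrice(100, 100, 0.05, 0.20, 0.5),
            CallDelta(100, 100, 0.05, 0.20, 0.5));
    }
}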
 
Hey Mike, ever get this together? I'm looking to do the same but would prefer not to duplicate effort.
 
Yes and no . . . apparently, the sky is the limit. My impression is just about everything is custom, so it depends on whom you're doing the work for. I went with QLNet as a starter.
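If it helps, a rough sketch of pricing a vanilla European call with QLNet is below. QLNet mirrors QuantLib's API, but I'm writing this from memory, so treat the exact class names and constructors as an approximation to check against your QLNet version:

using QLNet;

class QlNetDemo
{
    static void Main()
    {
        Date today = new Date(15, Month.May, 2011);
        Settings.setEvaluationDate(today);

        DayCounter dc = new Actual365Fixed();
        Calendar cal = new TARGET();

        // Flat market data: spot 100, 5% rate, no dividends, 20% vol.
        var spot = new Handle<Quote>(new SimpleQuote(100.0));
        var rates = new Handle<YieldTermStructure>(new FlatForward(today, 0.05, dc));
        var divs  = new Handle<YieldTermStructure>(new FlatForward(today, 0.0, dc));
        var vol   = new Handle<BlackVolTermStructure>(new BlackConstantVol(today, cal, 0.20, dc));

        var process = new BlackScholesMertonProcess(spot, divs, rates, vol);

        // At-the-money call expiring in six months.
        var payoff   = new PlainVanillaPayoff(Option.Type.Call, 100.0);
        var exercise = new EuropeanExercise(cal.advance(today, new Period(6, TimeUnit.Months)));
        var option   = new VanillaOption(payoff, exercise);
        option.setPricingEngine(new AnalyticEuropeanEngine(process));

        System.Console.WriteLine("NPV: {0}, delta: {1}", option.NPV(), option.delta());
    }
}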
 
You should think about which functions fit better in the database layer as compared to the application layer. In most cases, your other functions would be wrapped around these functions. But trying to fit everything in stored procedures is similar to excessive denormalization and will be very inefficient and will not scale.
 
For the business, all this means efficiency, lower cost, better performance, centralized/standardized systems/applications/coding/etc. . . . in the end, my focus is the entire "business process" rather than the components of the IT system.

It is quite surprising to find so many businesses doing quant computations in Excel on desktop machines . . .

A database server is not simply a filing cabinet . . . relative to the other machines in most organizations, database servers are among the most powerful computers available. Now that the SQL Server database software has access to the highly efficient .NET CLR, desktop applications (e.g. Excel) need only take care of presentation and user "click" actions, which are well within the processing power of a desktop computer.

Otherwise, "data set"-level computations are generally best performed on the database server . . . that's one of the primary functions of a DB server! Over and above that, there are substantial efficiencies in avoiding back-and-forth "mass" transfers of data from the DB server across the network to an application on a desktop machine. It makes no sense to have a desktop machine perform huge computations when the much more powerful DB server can do them faster and with greater accuracy. Why not have the DB server deliver to the desktop application one "average number" rather than the 10,000 individual numbers used to compute that average?

While nothing matches the native DB language (T-SQL) for calling up, filtering, and sorting data, it is not as efficient as the .NET CLR at computations. Hence, giving the DB server access to the much more efficient .NET CLR relieves and restores processing power! After all, the DB server already has the data called up and in hand . . . any additional overhead related to the computations is recovered several times over by the efficiency gains already noted.
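To make the "one number instead of 10,000" point concrete, here's a sketch of a CLR user-defined aggregate for a geometric mean (something T-SQL has no built-in for) . . . once the assembly is registered, it's called like any aggregate, e.g. SELECT dbo.GeoMean(GrowthFactor) FROM dbo.Returns, and only the final scalar crosses the network. All names are illustrative:

using System;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

[Serializable]
[SqlUserDefinedAggregate(Format.Native)]
public struct GeoMean
{
    private double logSum;
    private long count;

    public void Init()
    {
        logSum = 0.0;
        count = 0;
    }

    // Called once per row in the group; NULLs and non-positive values are skipped.
    public void Accumulate(SqlDouble value)
    {
        if (value.IsNull || value.Value <= 0.0) return;
        logSum += Math.Log(value.Value);
        count++;
    }

    // Combines partial results when SQL Server parallelizes the scan.
    public void Merge(GeoMean other)
    {
        logSum += other.logSum;
        count += other.count;
    }

    // Only this single value travels back to the client.
    public SqlDouble Terminate()
    {
        return count == 0 ? SqlDouble.Null : new SqlDouble(Math.Exp(logSum / count));
    }
}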

I've already fully integrated QLNet into a database and it works great.

Thanks but . . . the original question was "what", not "how". I suppose the question could be refined to "what is hot" right now.
 