I have to admit that I have a thing for DSLs. You can see it in music-as-data, where notes and rhythm/beat are "mapped" to data and you can apply data transformations.
I want to do the same thing with data at rest.
Here is a scenario: I have lots of data sitting as CSV files on my hard drive, and I want to process them. Not query them. Process them.
What would be really interesting is to be able to define (dynamically) a schema like this:
(defschema "EURUSD"
  (tokenizer #(.split % ":"))
  ;; the mapping is done here
  (columns |time| |open| |high| |low| |close| |volume|))
Let me explain. First of all, there is a "tokenizer" function. Each data line is tokenized by a function of your choice. Do you want a regex? Something more complex? You are free to write anything you like. I really hate frameworks where you must write a complex regular expression or use a complicated system just to tokenize a line.
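To make that concrete: a tokenizer is just any one-argument function from a raw line to a sequence of tokens. The two below are only illustrations of that idea; the helper names are mine, not part of the post.

(require '[clojure.string :as str])

;; split on ":" exactly like the schema above
(def colon-tokenizer #(str/split % #":"))

;; or use a regex when the separators are messier
(def loose-tokenizer #(str/split % #"[,;:]\s*"))

(colon-tokenizer "20141120:1.2543:1.2561:1.2533:1.2550:10432")
;; => ["20141120" "1.2543" "1.2561" "1.2533" "1.2550" "10432"]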
As you can imagine, the tokenizer returns a list of tokens that are mapped onto the "columns".
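One way this could work under the hood: defschema registers the tokenizer and the column names, and a helper zips each tokenized line onto those names. This is only a minimal sketch of that idea; the schemas atom and parse-line are my own assumptions, not something from the post.

(def schemas (atom {}))

(defmacro defschema [schema-name & clauses]
  ;; turn (tokenizer ...) and (columns ...) clauses into a map entry
  (let [clause-map (into {} (map (fn [[op & args]] [(keyword op) args]) clauses))]
    `(swap! schemas assoc ~schema-name
            {:tokenizer ~(first (:tokenizer clause-map))
             :columns   '~(vec (:columns clause-map))})))

(defn parse-line
  "Tokenize a raw line with the schema's tokenizer and zip the
   tokens onto the schema's column names."
  [schema-name line]
  (let [{:keys [tokenizer columns]} (get @schemas schema-name)]
    (zipmap columns (tokenizer line))))

;; with the EURUSD schema above registered:
(parse-line "EURUSD" "20141120:1.2543:1.2561:1.2533:1.2550:10432")
;; => {|time| "20141120", |open| "1.2543", ... |volume| "10432"}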
Now, the interesting stuff.
You can write scripts like the following:
(if (> |close| 1.45)
  (place-order :buy)
  (place-order :sell))
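The script refers to |close| directly, which suggests the DSL would bind column values as locals, probably via another macro. As a simpler approximation, here is a sketch that runs a plain function over every parsed row of a file, looking the column up in the map returned by parse-line above. The names run-rule, place-order, and the file name are my own assumptions.

(require '[clojure.java.io :as io])

(defn place-order [side]
  ;; stand-in for whatever real order placement would do
  (println "placing" side "order"))

(defn run-rule
  "Apply rule to every line of file, parsed with the named schema."
  [schema-name file rule]
  (with-open [rdr (io/reader file)]
    (doseq [line (line-seq rdr)]
      (rule (parse-line schema-name line)))))

;; the rule from the post, expressed as a function over a parsed row
(run-rule "EURUSD" "EURUSD.csv"
          (fn [row]
            (if (> (Double/parseDouble (get row '|close|)) 1.45)
              (place-order :buy)
              (place-order :sell))))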
Thoughts?
ping me here -> JR
- Thu 20 November 2014