Applying data science techniques on production trading desks typically involves moving data sets around for what-if, trial-and-error simulations, frequently without audit controls or change-management mechanisms in place to tie the insights gleaned back to their data sources.
Moreover, there is direct value in streamlining your workflow so that your insights arrive fast and correct. You may learn something valuable from last week’s batch run. Is the insight still valid? Hopefully. Did that insight come from the Hadoop cluster data lake or from the Snowflake data dump? Where did those files originate again? Bob’s out this week.
The key topics of this webinar include: common scenarios where data science techniques are used in trading, how a lack of engineering rigor or audit controls can affect results in those scenarios, and how the familiar data science notebook paradigm can be embedded not just in your environment but in your applications to supply that rigor.