Hi, I'd like to apologize in advance for my lack of knowledge, not only in this particular area but generally. I'm afraid this will be a really dumb question, but I couldn't find the answer through any searches on Google or here, so here I go. (If there's a better subreddit for this kind of question, please let me know.)
I guess companies like Amazon or Netflix use machine learning techniques to recommend new movies/products a user is likely to be interested in, etc.
The web part seems pretty straightforward: when a user buys a product, the database is updated so that, e.g., the corresponding User object now has an entry pointing to the corresponding Product object. With thousands or millions of records like this, we could predict what a user is likely to buy and display it in the Recommended section of the site using some machine learning technique.
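For example, the write path I'm imagining looks roughly like this (just my mental model; the names `purchases` and `record_purchase` are made up, and a real site would use a database table rather than an in-memory dict):

```python
# Toy sketch of the write path: when a user buys a product, append the
# product id to that user's purchase history. All names are illustrative.
from collections import defaultdict

# user_id -> list of product_ids bought (stand-in for a database table)
purchases = defaultdict(list)

def record_purchase(user_id, product_id):
    """Called by the web app when a checkout completes."""
    purchases[user_id].append(product_id)

record_purchase("alice", "p1")
record_purchase("alice", "p2")
record_purchase("bob", "p1")
```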
So how exactly is this done? Are there separate servers dedicated to periodically running the algorithms, which pull this immense amount of data from the database, compute the recommendations, and update each corresponding User object so that it now holds the newly computed data in its Recommended field?
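Something like this toy sketch is what I have in mind for the periodic batch job (all names and the simple co-occurrence scoring are just my assumptions, not how any real company does it):

```python
# Toy batch job: periodically scan the purchase history, count which
# products are bought together, and store the top co-purchased items for
# each user. This is item-to-item collaborative filtering in miniature.
from collections import defaultdict
from itertools import combinations

def compute_recommendations(purchases, top_n=3):
    """purchases: dict of user_id -> list of product_ids."""
    # 1. Count how often each pair of products appears in the same history.
    co_counts = defaultdict(lambda: defaultdict(int))
    for items in purchases.values():
        for a, b in combinations(set(items), 2):
            co_counts[a][b] += 1
            co_counts[b][a] += 1

    # 2. For each user, score products they don't own by how often those
    #    products co-occur with what they did buy, and keep the top N.
    recommended = {}
    for user, items in purchases.items():
        owned = set(items)
        scores = defaultdict(int)
        for item in owned:
            for other, count in co_counts[item].items():
                if other not in owned:
                    scores[other] += count
        ranked = sorted(scores, key=scores.get, reverse=True)
        recommended[user] = ranked[:top_n]  # write back to the User row
    return recommended

purchases = {
    "alice": ["p1", "p2"],
    "bob":   ["p1", "p3"],
    "carol": ["p2", "p3"],
}
recs = compute_recommendations(purchases)
# alice bought p1 and p2, both of which co-occur with p3 -> recommend p3
```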
I remember using MATLAB in the online course, but I don't think it's suitable for this kind of job; it's more of an analysis tool. What technology would they use to run all the ML algorithms, and for scaling as well?