I am not going to use fancy slides or (un)supported data to articulate my viewpoint. So at the risk of coming across as very un-investment-bankish, I am going to make some bold predictions about Big Data.
I use the term Big Data to mean data that is insightful and predictive. If you have a truckload of data that just sits in some file system, that's just dumb data. For that data to be useful, decision makers should be able to derive insights from it, or, in the more innovative case, the data feeds into a machine-learning, Siri-like or Google Now-like engine based on natural language processing.
With machine-generated Big Data, the system has to undergo a series of trials and errors. For machine-learning engines based on Big Data to make accurate decisions or predictions, there has to be data about me or you that tells the engine what we like and what we don't. Consider this: LinkedIn sends me a notification stating that someone completely unrelated to me, professionally speaking, with no shared connections or interests, wants to reach out. If I click 'Ignore,' LinkedIn's engine will capture, or 'learn,' that I am not interested in a particular person from a particular industry, and so on. Hopefully, in the future, LinkedIn will send me more relevant notifications. The more notifications it sends and the more I interact with the system, the more data is generated, thereby enabling LinkedIn to learn my preferences.
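The feedback loop described above can be sketched in a few lines of code. This is a deliberately toy illustration, not LinkedIn's actual system: every feature name, the learning rate, and the update rule are my own assumptions, chosen only to show how accept/ignore signals can nudge an engine's sense of a user's preferences over time.

```python
class PreferenceLearner:
    """Toy online learner: each accept/ignore click nudges per-feature weights."""

    def __init__(self, learning_rate=0.1):
        self.weights = {}  # feature -> learned preference weight
        self.learning_rate = learning_rate

    def score(self, features):
        """Predicted relevance of a suggestion with these features."""
        return sum(self.weights.get(f, 0.0) for f in features)

    def feedback(self, features, accepted):
        """Update weights: positive signal for 'Accept', negative for 'Ignore'."""
        signal = 1.0 if accepted else -1.0
        for f in features:
            self.weights[f] = self.weights.get(f, 0.0) + self.learning_rate * signal


learner = PreferenceLearner()

# The user ignores two finance-industry invites and accepts a tech one.
learner.feedback(["industry:finance", "no_shared_connections"], accepted=False)
learner.feedback(["industry:finance"], accepted=False)
learner.feedback(["industry:tech", "shared_connections"], accepted=True)

# After this trial and error, a tech suggestion now outscores a finance one.
print(learner.score(["industry:tech"]) > learner.score(["industry:finance"]))  # True
```

The point is not the specific math but the loop: every interaction produces a data point, the data point adjusts the model, and the next suggestion is slightly better informed, which is exactly the perpetual trial-and-error dynamic discussed below.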
So you see what I mean about Big Data being a trial-and-error concept. Big Data-based machine learning will stay in a perpetual state of trial and error because it has to become more accurate with every interaction with a human or, in the future, with another connected device. In a way, you can think of the machine-learning curve as extending out into infinity until Technological Singularity, a topic I've covered in an earlier post. Technological Singularity is the convergence of humans and machines. The human will always be the end user, though. Well, that depends on what you define as 'human' in the next 50 years. It may simply be an entity with a consciousness, period.
I use LinkedIn only as an example to illustrate Big Data-based engines, but one could use any machine-learning engine based on natural language processing or sentiment analysis, all layered on Big Data. Another great example is a startup in beta mode, Stitch Fix. Stitch Fix's model is to ship a box of clothes to women based on their dressing style. I am not going to delve into the details of the business model, but the point I am trying to make is that every time the user ships back the clothes she doesn't want to keep, the engine updates itself to reflect a more accurate sense of her dressing style. So the next time Stitch Fix sends a box of clothes to the user, it hopes to send clothes that match her preferences. Again, this illustrates the constant trial-and-error learning based on all the data the engine collects about the user - Big Data!