You cannot have big data without big volumes of data. If you can count the data on your fingers or manage it with a simple traditional database solution, then you are doing just that: working with ordinary business data. It is worth noting, though, that in most cases big data starts out small. The data sizes should make your IT folks either sweat or look seriously at cloud-based solutions. If that is not the case, you are probably just working with traditional-sized data sets.
2. Who needs Hadoop?
Hadoop is the Apache open-source software framework for working with big data. It was originally developed and deployed by several Bay Area tech giants, and in a very short time it has become the de facto standard for small and medium-sized companies trying to step into the world of big data and analytics. You should have a very good reason for not using Hadoop; make sure that reasoning actually is good and does not merely sound good.
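To make this concrete, the canonical first Hadoop program is a MapReduce word count. The sketch below mimics the Hadoop Streaming style in plain Python (the sorted-shuffle step that Hadoop performs between map and reduce is simulated locally); it is an illustration of the programming model, not a production job.

```python
from itertools import groupby

def mapper(lines):
    # Map phase: emit a (word, 1) pair for every word,
    # just as a Hadoop Streaming mapper writes to stdout.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    # Hadoop sorts mapper output by key before the reduce phase;
    # sorting here simulates that shuffle, then we sum per word.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    sample = ["big data is big", "data grows fast"]
    print(dict(reducer(mapper(sample))))
```

In a real cluster, the mapper and reducer would run as separate processes on many machines, with Hadoop handling the distribution, sorting, and fault tolerance.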
3. Structured Data is All That I Care About
Big data does not have to include unstructured data, but if your solution is unable to deal with it, then you are probably dealing with a traditional database. It might be large, but it is still not big data. A big data solution must be able to handle data that looks completely "random". Some examples of unstructured data are social media sentiment, multimedia files, emails, and so on.