ub16 · JF-Expert Member · Feb 19, 2013
Remember space complexity, bro. And ever since that other thread I've noticed you don't pay much attention to algorithm efficiency.
Bro, stop denying what's plain to see: 100 GB is a lot of data. Otherwise you'd need a computer with serious horsepower (and that's expensive).
Many programmers have never studied algorithms, which is exactly what would help them analyze their programs. When the data gets that big, boss, it causes trouble, so you have to find ways to cut it down using various techniques.
You can use big data tools to analyse 100 GB of data, but that doesn't make it big data.
Your computer has more storage than that; it can index the data and comb through it with decent performance. A real example is Apple: they pride themselves on privacy by doing all the machine learning and data processing on the device, with nothing sent to a server. All of that on an iPhone with an ARM CPU, a specialised co-processor, and 64 GB or 256 GB of storage. Or is the iPhone also a computer with serious horsepower?
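To make that concrete, here's a minimal sketch (in Python) of the streaming style that lets an ordinary machine comb through a file far bigger than its RAM: read it in fixed-size chunks so memory use stays flat. The filename and chunk size are made up for illustration.

```python
# Minimal sketch: scan a large file in fixed-size chunks so memory
# use stays constant no matter how big the file is.
# "logs.bin" and the 64 MiB chunk size are hypothetical values.

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MiB per read

def count_newlines(path: str) -> int:
    """Stream through `path` and count newline bytes without
    ever holding more than one chunk in memory."""
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            total += chunk.count(b"\n")
    return total

if __name__ == "__main__":
    print(count_newlines("logs.bin"))
```

Sequential reads like this are exactly what disks and SSDs are fastest at, which is why scanning 100 GB this way finishes in minutes, not hours.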
You can easily manipulate 100 GB of data with the right techniques. Throw around words like algorithms and complexity all you want, but 100 GB is not big data. Even SQLite alone can store up to 140 terabytes; that doesn't mean you should run a 140 TB database, but it puts into perspective how little 100 GB is.
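For perspective, a minimal sketch of the SQLite point. The database file, table, columns, and query below are hypothetical; the pattern is the standard one: load the data once, add an index, and lookups stay fast even when the table holds tens of gigabytes.

```python
# Minimal sketch: SQLite handling structured data on a laptop.
# "events.db", the events table, and the query are hypothetical;
# the point is that an ordinary index avoids full-table scans.

import sqlite3

conn = sqlite3.connect("events.db")  # hypothetical database file
conn.execute(
    "CREATE TABLE IF NOT EXISTS events (ts INTEGER, user TEXT, action TEXT)"
)
# An index turns linear scans into logarithmic B-tree lookups.
conn.execute("CREATE INDEX IF NOT EXISTS idx_events_user ON events (user)")
conn.commit()

# This query walks the index instead of reading every row.
rows = conn.execute(
    "SELECT ts, action FROM events WHERE user = ? LIMIT 10", ("alice",)
).fetchall()
print(rows)
conn.close()
```

No cluster, no big data tooling: one file on disk and a few lines of code.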