
Hadoop or Cognos... tech help please!


athapurbaba


Rajendrudu and Gajendrudu,

My background is .NET... what do you say? I'm thinking of taking Hadoop coaching.


My white boss says Hadoop will cool down after a few years. Initially everyone will be excited to implement the Hadoop ecosystem, but after a few years, when every Hadoop environment gets big with data, nobody will know how to do performance tuning because it's open source, and everyone will go back to regular RDBMS. So he said: sit back, relax, and enjoy the freak show.



That's why he's spinning this story. In fact, can a regular RDBMS even handle huge data above 4 TB? If it can't handle it now, how will it handle it in the future when the data grows even bigger, and how will that RDBMS be tuned? So Hadoop is the answer; maybe a newer tool will come along, but RDBMS is not the answer for big data. Tell him from my side: if you just keep watching, everyone else moves on, and one day no one will be there to watch you and your data. (jk)



Yes, currently it's a freak show and most of the market is driven by hype. Many companies are yet to utilize the full potential of big data, but it's just a matter of time.

 


We already have Teradata, which can handle more than 10 TB of data efficiently; in fact, at my current workplace the TD database is over 8 TB. As you stated, RDBMS is definitely not the answer for big data, but newer technologies are emerging to support vast amounts of data.



 


But I think there is a limit for Teradata, right? How many transactions can it handle if it is used in an application like Twitter?


Hmm, that was the consultancy guy's suggestion, bro. What exactly is big data? What tools does it include?

How long are you going to cling to Cognos? Move on... learn some big data tool.


Like Splunk...

If it's purely reporting, Tableau and MSTR are better options.


So is there coding in those tools, bro? In Mongo, Cassandra, etc.?

Hold on, hold on... don't scare the kids by telling them they need to write mappers, reducers, and wrappers. Nobody is writing those from scratch these days; everything is customization of third-party tools for Hadoop.

Learn Hadoop concepts... I mean just the theory, and then concentrate on a tool like Spark, Hive, Impala, Cassandra, Mongo, and so on.
The first question you will be asked in an interview will not be "What is Hadoop?" but "How will you do performance tuning for a Hadoop environment?"
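
To make that concrete, here is a minimal sketch, assuming Spark with its Scala API and a hypothetical HDFS input path (the object name and file path are made up for illustration), of the classic word count that in raw MapReduce would need hand-written Mapper and Reducer classes:

```scala
// Minimal Spark sketch. Assumptions: Spark 2.x or later, and a hypothetical
// input file at hdfs:///data/clickstream.txt. This runs the same
// map -> shuffle -> reduce pipeline as MapReduce, without writing
// Mapper/Reducer classes by hand.
import org.apache.spark.sql.SparkSession

object WordCountSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("word-count-sketch")
      // one example of a tuning knob: default shuffle parallelism for RDDs
      .config("spark.default.parallelism", "200")
      .getOrCreate()

    val counts = spark.sparkContext
      .textFile("hdfs:///data/clickstream.txt")  // hypothetical path
      .flatMap(_.split("\\s+"))                  // the "mapper" step
      .map(word => (word, 1))
      .reduceByKey(_ + _)                        // the "reducer" step

    counts.take(10).foreach(println)
    spark.stop()
  }
}
```

Hive and Impala push this even further: the same aggregation is just a GROUP BY in SQL, which is why the advice above is to learn the concepts plus one such tool rather than raw MapReduce.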


Hmm, that was the consultancy guy's suggestion, bro. What exactly is big data? What tools does it include?

 

Splunk, Spark, and other big data related tools... it helps if you know UNIX, SQL, and DW/BI concepts... Java is vast, and knowing it is even better; it's pretty much a must.

 

As discussed above... I feel like

 

"Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it..."

 

My company also uses it... just to process raw data, like data generated by social media, etc.

 

At the end of the day, for CXO-level reporting/analytics, the tools already in the market are the only option.


But I think there is a limit for Teradata, right? How many transactions can it handle if it is used in an application like Twitter?

 

Technically it all comes down to the level of tuning and the hardware (e.g., use of 4 Gb/s Fibre Channel). In our case the DB is updated with approximately 50M records a day. As far as Twitter goes, it stores a massive number of transactions, around 250M a day, using a customized version of MySQL combined with an in-house storage solution called Manhattan.
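
To put those daily volumes in perspective, here is a back-of-the-envelope sketch in plain Scala (numbers taken from the post above; averages only, peak rates would be several times higher):

```scala
// Rough conversion of records-per-day into average writes-per-second.
// Illustration only; this says nothing about either system's actual limits.
object WriteRateSketch {
  val secondsPerDay: Long = 24L * 60 * 60  // 86,400

  def avgPerSecond(recordsPerDay: Long): Double =
    recordsPerDay.toDouble / secondsPerDay

  def main(args: Array[String]): Unit = {
    println(f"50M/day  is about ${avgPerSecond(50000000L)}%.0f writes/s")   // ~579
    println(f"250M/day is about ${avgPerSecond(250000000L)}%.0f writes/s")  // ~2894
  }
}
```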

