vendettaa Posted February 6, 2019 (Author) 2 hours ago, Kalam_Youtheman said: "Come on. Do you even know what JMeter is used for? It's purely for load testing. Learn the basics before asking questions." Dude, load testing is done on Spark applications too; it might not be with JMeter, but such tools do exist. @Kalam_Youtheman, the all-knowing one, refer to the link below: https://community.hortonworks.com/articles/138300/spark-loadperformance-testing-using-gatling-part-i.html
vendettaa Posted February 6, 2019 (Author) 2 hours ago, Kalam_Youtheman said: "No basics at all, just like balayya." Right, what else do frogs in a well know besides posting comments like "balayya this, balayya that"?
vendettaa Posted February 6, 2019 (Author) 1 hour ago, WeBeliveInTigerman said: "Use a JVM profiler and Spark Metrics; Spark's metrics give more than enough info for Spark apps." Thank you man, finally one sensible reply.
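For context on the Spark Metrics suggestion above: Spark's metrics system is configured through a metrics.properties file. The fragment below is a minimal sketch; the sink choices are illustrative, and the class names come from Spark's own metrics.properties.template.

```properties
# Report all component metrics to JMX (visible in jconsole/VisualVM)
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink

# Also dump metrics to the console every 10 seconds
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds

# Expose JVM metrics (heap, GC) for the driver and executors
driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```

Point Spark at it with `--conf spark.metrics.conf=/path/to/metrics.properties` on spark-submit.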
soodhilodaaram Posted February 6, 2019 8 hours ago, vendettaa said: "Is there any tool like JMeter to do health checks on a Spark JAR?" Nobody even looks after a person's health.. where is a JAR file going to find it..
vendettaa Posted February 6, 2019 (Author) @tacobell fan @Kalam_Youtheman uncles, https://www.supergloo.com/fieldnotes/apache-spark-thrift-server-load-testing-example/ is exactly what I am looking for, but I asked hoping someone would suggest other tools; instead, the great minds here just divert the topic and laugh among themselves. APACHE SPARK THRIFT SERVER LOAD TESTING EXAMPLE OVERVIEW: How do we simulate anticipated load on our Apache Spark Thrift Server? In this post, we are going to use an open source tool called Gatling. Check out the References section at the bottom of this post for links to Gatling. At a high level, this Spark Thrift with Gatling tutorial will run through the following steps: confirm our environment (Spark, Cassandra, Thrift Server); compile our Gatling-based load-testing code; run a sample Spark Thrift load test.
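The core idea in that tutorial, N concurrent users firing the same query while latencies are recorded, can also be sketched in plain Scala without Gatling. This is a stub harness, not the article's code: `runQuery` fakes the latency with a sleep, where a real test would open a JDBC connection to the Thrift Server (e.g. `jdbc:hive2://localhost:10000` via `org.apache.hive.jdbc.HiveDriver`); both the stub and the connection details are assumptions.

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

object ThriftLoadSketch {
  // Stub standing in for a real JDBC call to the Spark Thrift Server.
  // Returns the observed latency of one query in milliseconds.
  def runQuery(sql: String): Long = {
    val start = System.nanoTime()
    Thread.sleep(10) // simulate query latency
    (System.nanoTime() - start) / 1000000
  }

  // Fire the same query from `users` concurrent threads and collect latencies.
  def loadTest(users: Int, query: String): Seq[Long] = {
    val pool = Executors.newFixedThreadPool(users)
    implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)
    try {
      val runs = Future.sequence(Seq.fill(users)(Future(runQuery(query))))
      Await.result(runs, 1.minute)
    } finally pool.shutdown()
  }

  def main(args: Array[String]): Unit = {
    val latencies = loadTest(8, "SELECT 1")
    println(s"max latency: ${latencies.max} ms over ${latencies.size} users")
  }
}
```

Gatling gives you the same shape with ramp-up profiles, percentile reports, and assertions on top, which is why the article reaches for it.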
Kalam_Youtheman Posted February 6, 2019 11 hours ago, WeBeliveInTigerman said: "Use a JVM profiler and Spark Metrics; Spark's metrics give more than enough info for Spark apps." He wants a tool "like JMeter", apparently.. ROFL.. total waste of time.
mettastar Posted February 6, 2019 Hey vendetta.. I don't know the answer to your question, but Spark standalone sounds interesting. Can you explain your use case? Why did you prefer Spark alone, without a resource manager and distributed storage? Any advantages to standalone?
tacobell fan Posted February 6, 2019 You are arguing that there is no difference between a health check and a load test. Good luck.
vendettaa Posted February 6, 2019 (Author) 7 hours ago, mettastar said: "Can you explain your use case? Any advantages to standalone?" It is for a Spark 2.4.0 Structured Streaming application; for now we are running POCs. We built a pipeline between Kafka, Spark Structured Streaming, and Cassandra, and a batch workload is present too. It is standalone at the moment, then we are shifting to Kubernetes: everything Docker containers.
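For readers curious what that pipeline looks like, here is a minimal sketch of the Kafka-to-Cassandra leg, assuming Spark 2.4 with the DataStax spark-cassandra-connector on the classpath; the broker address, topic, checkpoint path, keyspace, and table names are all made up, and this will not run outside a cluster with those dependencies.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

val spark = SparkSession.builder.appName("kafka-to-cassandra").getOrCreate()

// Read the Kafka topic as an unbounded streaming DataFrame
val events = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
  .option("subscribe", "events")                    // hypothetical topic
  .load()
  .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

// Spark 2.4: write each micro-batch to Cassandra via foreachBatch
val query = events.writeStream
  .option("checkpointLocation", "/tmp/chk") // needed for recovery bookkeeping
  .foreachBatch { (batch: DataFrame, _: Long) =>
    batch.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "poc", "table" -> "events")) // hypothetical
      .mode("append")
      .save()
  }
  .start()

query.awaitTermination()
```

The same code carries over to Kubernetes; only the master URL and the way the Docker image ships the connector JARs change.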
vendettaa Posted February 6, 2019 (Author) 7 hours ago, Kalam_Youtheman said: "He wants a tool like JMeter.. ROFL.. waste of time." I need something like JMeter, but for Scala.
AdaviRamudu Posted February 6, 2019 2 minutes ago, vendettaa said: "I need something like JMeter, but for Scala." Gold has arrived.......
vendettaa Posted February 7, 2019 (Author) Looking to schedule jobs that run on the Spark scheduler. Oozie is not a great option, so what are the best options to schedule jobs? Don't say cron jobs.
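On the scheduling question, the usual non-cron answers are dedicated workflow schedulers such as Apache Airflow (which ships a SparkSubmitOperator) or Azkaban. Purely to illustrate the bare mechanism they wrap, here is a minimal in-process sketch that fires a shell command at a fixed interval; the spark-submit command shown in the comment is a hypothetical stand-in.

```scala
import java.util.concurrent.{Executors, TimeUnit}
import scala.sys.process._

object JobScheduler {
  // Launch one job run as an external process and return its exit code.
  // A real setup would pass something like
  // Seq("spark-submit", "--class", "com.example.Main", "app.jar") (hypothetical).
  def submit(command: Seq[String]): Int = command.!

  // Fire the job every `periodSeconds` seconds, starting immediately.
  def scheduleEvery(periodSeconds: Long, command: Seq[String]): Unit = {
    val scheduler = Executors.newSingleThreadScheduledExecutor()
    scheduler.scheduleAtFixedRate(
      new Runnable { def run(): Unit = submit(command) },
      0, periodSeconds, TimeUnit.SECONDS)
  }
}
```

This has none of the retry, backfill, or dependency handling a real scheduler provides, which is why Airflow or Azkaban is the more common answer than anything hand-rolled.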