
Kafka Spark Streaming HBase



Posted
7 minutes ago, MuPaGuNa said:

After all that, still ............

I'm sweet 'chex'teen, man... even I don't know how to swim...

Exactly, exactly.

Apparently it's some Spark and HBase integration.

panipoori


Posted

After the stream comes in, writing into HBase is something like this:

val lines = stream.map(_.value())
lines.print()

lines.foreachRDD { rdd =>
  rdd.foreachPartition { iter =>
    // One HBase configuration and table handle per partition
    val hConf = HBaseConfiguration.create()
    val hTable = new HTable(hConf, "*****")
    iter.foreach { record =>
      // Pull out the JSON "payload" object once, then split its fields by hand
      val payload = record.substring(0, record.length - 1).split("\"payload\":")(1)
      val str1 = payload.split(",")(0).split(":")(1)
      val str2 = payload.split(",")(1).split(":")(1)
      val str3 = payload.split(",")(2).split(":")(1)
      val str4 = payload.split(",")(3).split(":")(1)
      val str5 = payload.split(",")(4).split(":")(1)
      val str6 = payload.split(",")(5).split(":")(1)
      val str7 = payload.split(",")(6).split(":")(1)
      val str8 = payload.split(",")(7).split(":")(1)
      val str9 = payload.split(",")(8).split(":")(1)
      val str10 = payload.split(",")(9).split(":")(1)
      val str11 = payload.split(",")(10).split(":")(1).split("}")(0)
      // The first field doubles as the row key
      val id = str1
      val thePut = new Put(Bytes.toBytes(id))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_**"), Bytes.toBytes(str1))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_*****"), Bytes.toBytes(str2))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_*****"), Bytes.toBytes(str3))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_*****"), Bytes.toBytes(str4))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_*****"), Bytes.toBytes(str5))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_****"), Bytes.toBytes(str6))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_A****"), Bytes.toBytes(str7))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_****"), Bytes.toBytes(str8))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_****"), Bytes.toBytes(str9))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_****"), Bytes.toBytes(str10))
      thePut.add(Bytes.toBytes("CUST"), Bytes.toBytes("CUST_*****"), Bytes.toBytes(str11))
      hTable.put(thePut)
    }
  }
}

ssc.start()
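(For what it's worth: the string splitting above is fragile if a field value ever contains a ',' or ':', and the deprecated HTable is never closed. A minimal sketch of the same write path with the HBase 1.x client API; the table name "CUST_TABLE" and the COL_0..COL_n qualifiers are placeholders, not the masked names above.)

    import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
    import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
    import org.apache.hadoop.hbase.util.Bytes

    lines.foreachRDD { rdd =>
      rdd.foreachPartition { iter =>
        // Connection is created on the executor, once per partition, and closed after use
        val conn  = ConnectionFactory.createConnection(HBaseConfiguration.create())
        val table = conn.getTable(TableName.valueOf("CUST_TABLE")) // placeholder table name
        try {
          iter.foreach { record =>
            // Split the payload once instead of re-splitting for every field
            val payload = record.dropRight(1).split("\"payload\":")(1)
            val fields  = payload.split(",").map(_.split(":")(1).stripSuffix("}"))
            val put = new Put(Bytes.toBytes(fields(0))) // first field as row key, as above
            fields.zipWithIndex.foreach { case (v, i) =>
              // addColumn(family, qualifier, value) is the non-deprecated form of Put.add
              put.addColumn(Bytes.toBytes("CUST"), Bytes.toBytes(s"COL_$i"), Bytes.toBytes(v))
            }
            table.put(put)
          }
        } finally {
          table.close()
          conn.close()
        }
      }
    }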

Posted
2 hours ago, vendettaa said:

How are the matchmaking meetings going, uncle?

Aunty, you could help others a bit too, MuPaGuNa

Posted
1 hour ago, Run said:

After the stream comes in, writing into HBase is something like this:

[quoted code snippet omitted; same as the post above]

@vendettaa no response. Did it work out or not?

Posted
3 minutes ago, Run said:

@vendettaa no response. Did it work out or not?

This doesn't work out.

By the way, where do you give the HBase credentials?
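(Side note: a plain HBase client doesn't take a username/password; it normally picks up the cluster details from an hbase-site.xml on the classpath, or you set the ZooKeeper quorum on the configuration yourself. Kerberized clusters need keytab setup on top of this. The hosts below are placeholders.)

    val hConf = HBaseConfiguration.create()
    // create() already reads hbase-site.xml from the classpath if present;
    // otherwise point the client at the cluster's ZooKeeper quorum explicitly:
    hConf.set("hbase.zookeeper.quorum", "zk-host1,zk-host2,zk-host3") // placeholder hosts
    hConf.set("hbase.zookeeper.property.clientPort", "2181")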

Posted
    def putRequest(t: (String, Long)) = {
      val p = new Put(Bytes.toBytes(t._1))
      p.add(Bytes.toBytes("line"), Bytes.toBytes(t._2))
    }

    def convert(t: (String, Long)) = {
      val p = new Put(Bytes.toBytes(t._1))
      p.add(Bytes.toBytes("line"), Bytes.toBytes(t._2))
      (t._1, p)
    }

  }
}

 

Error:(104, 9) overloaded method value add with alternatives:
  (x$1: org.apache.hadoop.hbase.Cell)org.apache.hadoop.hbase.client.Put <and>
  (x$1: Array[Byte],x$2: java.nio.ByteBuffer,x$3: Long,x$4: java.nio.ByteBuffer)org.apache.hadoop.hbase.client.Put <and>
  (x$1: Array[Byte],x$2: Array[Byte],x$3: Long,x$4: Array[Byte])org.apache.hadoop.hbase.client.Put <and>
  (x$1: Array[Byte],x$2: Array[Byte],x$3: Array[Byte])org.apache.hadoop.hbase.client.Put
 cannot be applied to (Array[Byte], Array[Byte])
      p.add(Bytes.toBytes("lines"), Bytes.toBytes(t._2))

 

Error:(109, 9) overloaded method value add with alternatives:
  (x$1: org.apache.hadoop.hbase.Cell)org.apache.hadoop.hbase.client.Put <and>
  (x$1: Array[Byte],x$2: java.nio.ByteBuffer,x$3: Long,x$4: java.nio.ByteBuffer)org.apache.hadoop.hbase.client.Put <and>
  (x$1: Array[Byte],x$2: Array[Byte],x$3: Long,x$4: Array[Byte])org.apache.hadoop.hbase.client.Put <and>
  (x$1: Array[Byte],x$2: Array[Byte],x$3: Array[Byte])org.apache.hadoop.hbase.client.Put
 cannot be applied to (Array[Byte], Array[Byte])
      p.add(Bytes.toBytes("lines"), Bytes.toBytes(t._2))

 

 

Help me out on this.
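(The compiler is saying Put.add has no two-argument overload: the three-byte-array form in the error is (family, qualifier, value), but the call passes only family and value. A minimal sketch of a fix, with "count" as a placeholder qualifier; the three-argument add from the error list would also compile, but it is deprecated in favor of addColumn:)

    def putRequest(t: (String, Long)) = {
      val p = new Put(Bytes.toBytes(t._1))
      // family = "line", qualifier = "count" (placeholder name), value = the Long
      p.addColumn(Bytes.toBytes("line"), Bytes.toBytes("count"), Bytes.toBytes(t._2))
    }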

Posted

Since when have they been giving out jobs on DB, man? Sai uncle didn't tell me. Is it only for girls, or for us male loafers too?

Posted
val messages = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](topicsSet, kafkaParams))
val lines = messages.map(_.value)
import sqlContext.implicits._

lines.foreachRDD(rdd => rdd.foreach(println))
println("44444444444444444")
lines.foreachRDD(raw=>{
   val xy = raw.map(_.value.toString)
   val df =sqlContext.read.json(xy)
 })

 

Here it throws: Error:(148, 27) value value is not a member of String
       val xy = raw.map(_.value.toString)

 

😫😫
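(The error is because lines = messages.map(_.value) already extracted the String values, so inside foreachRDD the elements are plain Strings and have no .value member. A sketch of the fix, assuming the goal is a DataFrame per batch: just drop the extra .value.)

    lines.foreachRDD { raw =>
      // raw is already RDD[String] (map(_.value) ran upstream), so no .value here
      val df = sqlContext.read.json(raw)
      df.show()
    }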

Posted
On 12/31/2018 at 1:03 PM, Paidithalli said:

Is it a typecasting issue, String to Long?

Thanks, a member on DB is helping me out.

Thanks to him.

Posted
On 12/28/2018 at 3:26 PM, jajjanaka_jandri said:

Don't HBase, Spark Streaming, and Kafka come under Hadoop?

They are different systems; they can be integrated with Hadoop, but Hadoop isn't the only option. They work with MPP platforms and AWS too, though HBase runs on top of HDFS.
