
shell scripting help needed


vendettaa

Recommended Posts

Just now, kiraak_poradu said:

What's the question?

What's the problem?

I have files in a folder on a remote cluster. I have to scp the latest file from the remote cluster to a folder on the current cluster. If that file is processed, the file is removed from the current folder; if it is not processed, the file stays in the folder, and the next time I get the latest file into the folder, the latest file should be appended to the unprocessed file and then the file is sent to be processed (processed using Spark).

I need a script for this, with error logging.


1 minute ago, vendettaa said:

I get the latest file into the folder, the latest file should be appended to the unprocessed file and then the file is sent to be processed (processed using Spark).

What do you mean by append? Simply add the contents of the latest file to the unprocessed file? What if you want to trace an issue back to one of these files? If you append them, it will be a nightmare to tell them apart.


3 minutes ago, vendettaa said:

I have files in a folder on a remote cluster. I have to scp the latest file from the remote cluster to a folder on the current cluster. If that file is processed, the file is removed from the current folder; if it is not processed, the file stays in the folder, and the next time I get the latest file into the folder, the latest file should be appended to the unprocessed file and then the file is sent to be processed (processed using Spark).

I need a script for this, with error logging.

Did you search Google first?



4 minutes ago, vendettaa said:

I have files in a folder on a remote cluster. I have to scp the latest file from the remote cluster to a folder on the current cluster. If that file is processed, the file is removed from the current folder; if it is not processed, the file stays in the folder, and the next time I get the latest file into the folder, the latest file should be appended to the unprocessed file and then the file is sent to be processed (processed using Spark).

I need a script for this, with error logging.

List out the steps for your requirement.


Just now, Idassamed said:

List out the steps for your requirement.

1) Files are located on the remote cluster, e.g. /remote/folderremote/

   -> file1

   -> file2

   -> file3

   -> file4

 

2) scp the latest file. Say file4 is the latest file loaded into folderremote:

   scp file4 from the remote cluster to /vhome/sam/folder_name/file4 on my cluster

3) hadoop fs -put -f /vhome/sam/folder /user/folderhadoop

4) spark-submit job.jar

5) rm /vhome/sam/folder_name/file4

 

If the Spark job fails, the file is not deleted; the next latest file is appended to it,

and steps 3, 4, and 5 are repeated.

This whole job is triggered every 4 hours.
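The five steps above, plus the retry-on-failure behaviour, can be sketched as one bash script with error logging. This is a sketch only: `user@host` and `job.jar` are placeholders, the paths are the example paths from the steps above, and the retry comes from appending each new file to a pending file that is deleted only when the Spark job succeeds.

```shell
#!/bin/bash
# Sketch of the 4-hourly job described above. The host, jar name and
# paths are placeholders taken from the steps in this thread.
set -u

REMOTE_SERVER="user@host"                 # placeholder login
REMOTE_DIR="/remote/folderremote"
LOCAL_DIR="/vhome/sam/folder_name"
HDFS_DIR="/user/folderhadoop"
PENDING="$LOCAL_DIR/pending.dat"          # unprocessed data accumulates here
LOG="$LOCAL_DIR/job.log"

log() { echo "$(date '+%F %T') $*" >> "$LOG"; }

# Append src to dst, removing src on success.
append_and_clear() { cat "$1" >> "$2" && rm -f "$1"; }

run_pipeline() {
    # 1-2) find the newest remote file and scp it over
    local latest
    latest=$(ssh "$REMOTE_SERVER" "ls -t $REMOTE_DIR | head -1") \
        || { log "ERROR: ssh/ls on $REMOTE_SERVER failed"; return 1; }
    scp "$REMOTE_SERVER:$REMOTE_DIR/$latest" "$LOCAL_DIR/$latest" \
        || { log "ERROR: scp of $latest failed"; return 1; }

    # append to whatever is still unprocessed from earlier failed runs
    append_and_clear "$LOCAL_DIR/$latest" "$PENDING"

    # 3) push the combined file to HDFS
    hadoop fs -put -f "$PENDING" "$HDFS_DIR" \
        || { log "ERROR: hdfs put failed"; return 1; }

    # 4-5) run Spark; delete the local copy only on success
    if spark-submit job.jar; then
        rm -f "$PENDING"
        log "OK: processed $latest"
    else
        log "ERROR: spark job failed; keeping $PENDING for the next run"
        return 1
    fi
}

# Only touch the cluster when explicitly asked (e.g. from cron every 4 hours):
[ "${RUN_PIPELINE:-0}" = "1" ] && run_pipeline
```

Scheduling it every 4 hours would then be a single cron line calling the script with `RUN_PIPELINE=1`.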


13 minutes ago, vendettaa said:

I have files in a folder on a remote cluster. I have to scp the latest file from the remote cluster to a folder on the current cluster. If that file is processed, the file is removed from the current folder; if it is not processed, the file stays in the folder, and the next time I get the latest file into the folder, the latest file should be appended to the unprocessed file and then the file is sent to be processed (processed using Spark).

I need a script for this, with error logging.

Search Google... there are plenty of scripts out there... this is very common in every office that works with FTP.


1 hour ago, Sreeven said:

Search Google... there are plenty of scripts out there... this is very common in every office that works with FTP.

Already searched Google.

I need it step by step, with error logging.

I'm juggling too many things, including refactoring my Spark code,

and this script feels like work from another planet.

I am able to scp the file, move it to HDFS, and run the Spark job,

but I want the latest file to be scp'd and appended to the current file; that's the logic I need.


13 minutes ago, vendettaa said:

Already searched Google.

I need it step by step, with error logging.

I'm juggling too many things, including refactoring my Spark code,

and this script feels like work from another planet.

I am able to scp the file, move it to HDFS, and run the Spark job,

but I want the latest file to be scp'd and appended to the current file; that's the logic I need.

latest_file=$(ls -t | head -1)

cat "$latest_file" >> current_file

I know you wanted the script.
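As a quick local check of that idiom (all file names here are hypothetical; the `sleep 1` only guarantees distinct modification times):

```shell
#!/bin/bash
# Demonstrate picking the newest file with `ls -t | head -1` and
# appending it to an accumulator file. All names are examples.
dir=$(mktemp -d)
printf 'first\n'  > "$dir/file1"
sleep 1                            # make sure file2 is strictly newer
printf 'second\n' > "$dir/file2"

latest=$(ls -t "$dir" | head -1)   # newest file by modification time
echo "latest: $latest"             # prints "latest: file2"

cat "$dir/$latest" >> "$dir/current_file"
rm -rf "$dir"
```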

 


19 minutes ago, Idassamed said:

latest_file=$(ls -t | head -1)

cat "$latest_file" >> current_file

I know you wanted the script.

 

How do I apply latest_file=$(ls -t | head -1)?

My script has to run on the current cluster;

I can just do an scp of the latest file.

How do I use scp to move the latest file?


40 minutes ago, vendettaa said:

How do I apply latest_file=$(ls -t | head -1)?

My script has to run on the current cluster;

I can just do an scp of the latest file.

How do I use scp to move the latest file?

scp source_file  user@host:/directory/

 

#!/bin/bash

remote_dir="remote_directory"

remote_server="user@host"

latest_file=$(ssh "$remote_server" "ls -t $remote_dir | head -1")

scp "$remote_server:$remote_dir/$latest_file" .

 

@vendettaa

 


40 minutes ago, Idassamed said:

scp source_file  user@host:/directory/

 

#!/bin/bash

remote_dir="remote_directory"

remote_server="user@host"

latest_file=$(ssh "$remote_server" "ls -t $remote_dir | head -1")

scp "$remote_server:$remote_dir/$latest_file" .

 

@vendettaa

 

What should $remote_dir be set to?

