relinking more s3 buckets
t101jv committed May 24, 2013
1 parent bac9c05 commit aa9293c
Showing 2 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion cloud_quiz/README.txt
@@ -10,7 +10,7 @@ example.pig uses the function RDFSplit3(...) which is defined in myudfs.jar:
OPTION 1: Do nothing. example.pig is already configured to read
myudfs.jar from S3, through the line:

-register s3n://uw-cse344-code/myudfs.jar
+register s3n://uw-cse-344-oregon.aws.amazon.com/myudfs.jar


OPTION 2: do-it-yourself; run this on your local machine:
6 changes: 3 additions & 3 deletions cloud_quiz/example.pig
@@ -1,9 +1,9 @@
-register s3n://uw-cse344-code/myudfs.jar
+register s3n://uw-cse-344-oregon.aws.amazon.com/myudfs.jar

-- load the test file into Pig
-raw = LOAD 's3n://uw-cse-344-oregon/cse344-test-file' USING TextLoader as (line:chararray);
+raw = LOAD 's3n://uw-cse-344-oregon.aws.amazon.com/cse344-test-file' USING TextLoader as (line:chararray);
-- later you will load to other files, example:
---raw = LOAD 's3n://uw-cse-344-oregon/btc-2010-chunk-000' USING TextLoader as (line:chararray);
+--raw = LOAD 's3n://uw-cse-344-oregon.aws.amazon.com/btc-2010-chunk-000' USING TextLoader as (line:chararray);

-- parse each line into ntriples
ntriples = foreach raw generate FLATTEN(myudfs.RDFSplit3(line)) as (subject:chararray,predicate:chararray,object:chararray);
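For intuition, the `myudfs.RDFSplit3(line)` UDF splits one N-Triples line into a (subject, predicate, object) tuple. The real implementation lives in myudfs.jar; the Python sketch below is only an illustrative approximation, assuming each line has the simple form `<subject> <predicate> object .` with a URI or blank-node subject and a URI predicate:

```python
import re

# Rough, hypothetical stand-in for the RDFSplit3 UDF (assumption: simple
# whitespace-delimited N-Triples; the object may be a URI or quoted literal).
TRIPLE_RE = re.compile(r'^(<[^>]*>|_:\S+)\s+(<[^>]*>)\s+(.*?)\s*\.\s*$')

def rdf_split3(line):
    """Return (subject, predicate, object) for an N-Triples line, or None."""
    match = TRIPLE_RE.match(line)
    if match is None:
        return None  # malformed line, analogous to the UDF rejecting it
    return match.group(1), match.group(2), match.group(3)

subject, predicate, obj = rdf_split3('<http://a> <http://b> "c" .')
```

A real RDF parser would also handle escapes, datatypes, and language tags, which this sketch ignores.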
