Bulk loading MongoDB from a JSON file with a number of objects
I want to bulk load MongoDB. I have about 200GB of files containing JSON objects to load. The problem is that I cannot use the mongoimport tool as-is, because the objects contain nested objects (i.e. I'd need to use the --jsonArray parameter), and that path is limited to 4MB.
CouchDB has a bulk load API where I can write a script and use cURL to send a POST request to insert the documents, with no size limits...
Is there anything like that in MongoDB? I know there is Sleepy Mongoose, but I'm wondering whether it can cope with a nested-array JSON insert?
Thanks!
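For reference, the CouchDB bulk API mentioned above is the `_bulk_docs` endpoint, which takes an array of documents in a single POST. A minimal sketch, assuming a local CouchDB on port 5984 and a database named `mydb` (both assumptions); the command is only printed here, since no running server is assumed:

```shell
# CouchDB's _bulk_docs endpoint accepts {"docs": [...]} in one POST.
# Host, port, and the database name "mydb" are assumptions.
PAYLOAD='{"docs":[{"_id":"a","val":1},{"_id":"b","val":2}]}'
BULK_CMD="curl -X POST http://localhost:5984/mydb/_bulk_docs -H 'Content-Type: application/json' -d '$PAYLOAD'"
# Print rather than execute, since no CouchDB server is assumed here.
echo "$BULK_CMD"
```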
OK, it appears there is no real answer unless I write my own tool in Java or Ruby to pass the objects in (meh, effort)... That's a real pain, so instead I decided to split the files down into 4MB chunks. I wrote a simple shell script using split (note: I had to split the files multiple times because of its limitations). I used the split command with -l (line count) so that each file had x number of lines in it. In my case each JSON object was about 4KB, so I guessed the line sizes.
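A minimal sketch of that chunking step, assuming newline-delimited JSON. The file names and the 10-line chunk size here are illustrative; with ~4KB objects you'd want on the order of 1000 lines per chunk to stay under 4MB:

```shell
#!/bin/sh
# Generate a small sample newline-delimited JSON file (a stand-in for
# the real 200GB data), then split it by line count so each chunk
# stays under mongoimport's size limit.
mkdir -p chunks
i=1
while [ "$i" -le 25 ]; do
  printf '{"n": %d}\n' "$i"
  i=$((i + 1))
done > data.json
# -l 10: 10 lines per chunk (illustrative; size real chunks to ~4MB).
# Default two-letter suffixes (aa..zz) allow at most 26*26 = 676
# chunks; GNU split's -a flag widens the suffix if you need more.
split -l 10 data.json chunks/part_
ls chunks/
```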
For anyone wanting to do this, remember that split can only make 676 files with its default two-letter suffixes (26*26), so you need to make sure each file has enough lines in it to avoid running out of suffixes and missing half your files. Anyway, I put all of this in an old bash script, used mongoimport, and let it run overnight. The easiest solution IMO, and no need to cut and mash files or parse the JSON in Ruby/Java or whatever else.
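The overnight run can be sketched as a loop over the chunk files. The database and collection names ("mydb", "mycoll") and the `part_` prefix are assumptions, and the mongoimport command is only echoed as a dry run here, since no MongoDB server is assumed to be available:

```shell
#!/bin/sh
# Create a couple of stand-in chunk files so the loop has input.
mkdir -p chunks
printf '{"n": 1}\n' > chunks/part_aa
printf '{"n": 2}\n' > chunks/part_ab
# Import each chunk in turn. "mydb"/"mycoll" are assumed names;
# the leading echo makes this a dry run - drop it to really import.
for f in chunks/part_*; do
  echo mongoimport --db mydb --collection mycoll --file "$f"
done
```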
The scripts are a bit custom, but if anyone wants them, leave a comment and I'll post them.