
I have a problem importing a big XML file (1.3 GB) into MongoDB in order to find the most frequent words in a map-reduce manner.

http://dumps.wikimedia.org/plwiki/20141228/plwiki-20141228-pages-articles-multistream.xml.bz2

Here I enclose an XML excerpt (the first 10,000 lines) cut from this big file:

http://www.filedropper.com/text2

I know that I can't import XML directly into MongoDB. I used some tools to do so, and some Python scripts, and all have failed.

Which tool or script should I use? What should be the key and value? I think the best document shape for finding the most frequent word would be this:

(_id : id, value: word )

Then I would sum all the elements, as in the docs example:

http://docs.mongodb.org/manual/core/map-reduce/
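To make the summing step concrete, here is a minimal local simulation of that map-reduce pattern in plain Python (the sample words are made up for illustration): the map phase emits (word, 1) per document of the proposed (_id, value) shape, and the reduce phase sums the 1s per word, which is exactly what the MongoDB map-reduce from the docs would compute server-side.

```python
from collections import defaultdict

# Hypothetical documents of the proposed shape (_id: id, value: word)
docs = [{"_id": i, "value": w}
        for i, w in enumerate(["kot", "pies", "kot", "ala"])]

# Map phase: emit (word, 1) for every document
emitted = defaultdict(list)
for d in docs:
    emitted[d["value"]].append(1)

# Reduce phase: sum the emitted 1s per word
counts = {word: sum(ones) for word, ones in emitted.items()}
print(counts)  # {'kot': 2, 'pies': 1, 'ala': 1}
```

The same emit/sum pair translates directly into the map and reduce JavaScript functions shown in the linked MongoDB docs.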

Any clues would be greatly appreciated, but how do I import this file into MongoDB to get documents like this?

(_id : id, value: word )

If you have any idea please share.

Edited: After some research, I would use Python or JS to complete this task.

I would extract only the words in the <text></text> section, which sits under <page><revision>, exclude entities such as &lt;, &gt;, etc., then split out the words and upload them to MongoDB with PyMongo or JS.

So there are several pages, each with a revision and a text section.
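A sketch of that extraction plan, assuming only the page/revision/text structure described above: xml.etree.ElementTree.iterparse reads the dump incrementally, so the 1.3 GB file never has to fit in memory, and ElementTree decodes the &lt;/&gt; entities automatically.

```python
import io
import re
import xml.etree.ElementTree as ET
from collections import Counter

def count_text_words(xml_file):
    """Stream the dump and count the words found inside <text> elements."""
    counts = Counter()
    for _, elem in ET.iterparse(xml_file, events=("end",)):
        # Real dumps namespace their tags, so match on the tag suffix
        if elem.tag.endswith("text") and elem.text:
            counts.update(re.findall(r"\w+", elem.text.lower()))
        elem.clear()  # release each element once it has been processed
    return counts

# Tiny stand-in for the dump structure described above
sample = io.StringIO(
    "<mediawiki><page><revision>"
    "<text>Ala ma kota a kot ma Ale</text>"
    "</revision></page></mediawiki>"
)
counts = count_text_words(sample)
print(counts.most_common(1))  # [('ma', 2)]
```

From there, each word (or each precomputed Counter entry) can be inserted into MongoDB with PyMongo.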

  • Does anyone know how to convert the text section of such a big file into CSV or JSON? Commented Jan 8, 2015 at 23:02
  • The problem of big files can be solved with fileinput, because it loads only one line at a time rather than the whole file into memory; then you decide when to write to another file (CSV or JSON). Commented Jan 9, 2015 at 0:38
  • Can you give me an example? Commented Jan 9, 2015 at 0:41
  • I made this; since the resulting file will be really big, using open would use all the memory: github.com/abdelouahabb/kouider-ezzadam/blob/master/… Commented Jan 9, 2015 at 1:15
  • I tried doing that, and also stackoverflow.com/questions/19286118/…, and got a memory error. Commented Jan 9, 2015 at 9:16
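The fileinput idea from the comments above can be sketched like this (the file names are made up, and the one-tag-per-line matching is only to show the streaming pattern; real <text> elements span many lines, so a streaming XML parser is still the safer route):

```python
import csv
import fileinput
import os
import tempfile

# Hypothetical miniature stand-in for the dump
src = tempfile.NamedTemporaryFile("w", suffix=".xml", delete=False)
src.write("<page>\n<revision>\n<text>ala ma kota</text>\n</revision>\n</page>\n")
src.close()

csv_path = src.name + ".csv"
with open(csv_path, "w", newline="") as out:
    writer = csv.writer(out)
    # fileinput hands us one line at a time, so memory use stays flat
    with fileinput.input(src.name) as lines:
        for line in lines:
            line = line.strip()
            if line.startswith("<text>") and line.endswith("</text>"):
                for word in line[len("<text>"):-len("</text>")].split():
                    writer.writerow([word])

with open(csv_path) as f:
    words = [row[0] for row in csv.reader(f)]
print(words)  # ['ala', 'ma', 'kota']

os.remove(src.name)
os.remove(csv_path)
```

The resulting CSV (one word per row) can then be bulk-loaded with mongoimport or PyMongo.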

2 Answers


To store all this data, save it in GridFS.

And the easiest way to convert the XML is to use this tool to convert it to JSON and save it:

https://stackoverflow.com/a/10201405/861487

import xmltodict

doc = xmltodict.parse("""
<mydocument has="an attribute">
  <and>
    <many>elements</many>
    <many>more elements</many>
  </and>
  <plus a="complex">
    element as well
  </plus>
</mydocument>
""")

doc['mydocument']['@has']  # u'an attribute'

3 Comments

Thank you for your help, but it doesn't work. I have even installed both xmltodict modules (the one you included and the official one), but I get "object has no attribute parse…". I think I should extract and prepare the data before uploading. Something like: stackoverflow.com/questions/18595791/…
I just tested it, and it works. Did the example work?
Yes, it works. I have a different idea of how to import into MongoDB. Could you take a look at stackoverflow.com/questions/27841981/…, please? If this is resolved, I will handle it.

The XML file I'm using goes this way:

<labels>
     <label>
          <name>Bobby Nice</name>
          <urls>
               <url>www.examplex.com</url>
               <url>www.exampley.com</url>
               <url>www.examplez.com</url>
          </urls>
     </label>
     ...
</labels>

and I can import it using xml-stream with MongoDB.

See: https://github.com/assistunion/xml-stream

Code:

var fs = require('fs');
var XmlStream = require('xml-stream');

// Pass the ReadStream object to xml-stream
var stream = fs.createReadStream('20080309_labels.xml');
var xml = new XmlStream(stream);
// `db` is assumed to be an already-connected MongoDB database handle

var i = 1;
var array = [];
xml.on('endElement: label', function(label) {
  array.push(label);
  db.collection('labels').update(label, label, { upsert:true }, (err, doc) => {
    if(err) {
      process.stdout.write(err + "\r");
    } else {
      process.stdout.write(`Saved ${i} entries..\r`);
      i++;
    }
  });
});

xml.on('end', function() {
  console.log('end event received, done');
});

