MongoDB: Copying an array to another array in the same document


I'm working on a MongoDB database for an online jewelry store, and I need to copy the brands contained in the "brands" array to a new array called "brandsnetherlands".

{
    "_id" : ObjectId("569d03b66abefa8be9c49f26"),
    "brands" : [
        "brand1",
        "brand2",
        "brand3"
    ],
    "name" : "family jewels",
    "address" : "",
    "housenr" : "",
    "postalcode" : "1234 aq",
    "city" : "amsterdam",
    "phone" : "+31 570 - 514200",
    "email" : "jewelry@email.nl",
    "web" : "http://www.familyjewels.nl/",
    "kind" : "horloges",
    "country" : "nederland",
    "brandsnetherlands" : [ ]
}

This is an example of the current structure of one of the documents in the "wholesalers" collection. I need a non-static script that allows me to move or copy the brands listed in the "brands" array into the empty "brandsnetherlands" array. Is this possible?

For relatively small data, you can achieve the above by iterating the collection using a snapshot cursor's forEach() method and updating each document as follows:

db.wholesalers.find({
    "brands": { "$exists": true, "$type": 4 }
}).snapshot().forEach(function(doc){
    db.wholesalers.updateOne(
        { "_id": doc._id },
        { "$set": { "brandsnetherlands": doc.brands } }
    );
});
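To spot-check the result, a quick sketch along these lines (assuming the same collection and field names as above) pulls back one updated document so you can compare the two arrays:

// fetch one document whose "brandsnetherlands" array is no longer empty,
// projecting only the two arrays we care about
db.wholesalers.findOne(
    { "brandsnetherlands.0": { "$exists": true } },
    { "brands": 1, "brandsnetherlands": 1 }
);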

Whilst this is optimal for small collections, performance with large collections suffers, since looping through a large dataset and sending each update operation to the server as a separate request incurs a computational penalty.

The Bulk() API comes to the rescue and improves performance, since write operations are sent to the server in bulk. Efficiency is achieved because the method does not send every write request to the server (as the current update statement within the forEach() loop does) but only once in every 1000 requests, making updates more efficient and quicker than they currently are.

Using the same concept as above with the forEach() loop to create the batches, we can update the collection in bulk as follows.

In this demonstration, the Bulk() API available in MongoDB versions >= 2.6 and < 3.2 uses the initializeUnorderedBulkOp() method to execute, in parallel and in a nondeterministic order, the write operations in the batches:

var bulk = db.wholesalers.initializeUnorderedBulkOp(),
    counter = 0; // counter to keep track of the batch update size

db.wholesalers.find({
    "brands": { "$exists": true, "$type": 4 }
}).snapshot().forEach(function(doc){
    bulk.find({ "_id": doc._id }).updateOne({
        "$set": { "brandsnetherlands": doc.brands }
    });

    counter++; // increment counter
    if (counter % 1000 == 0) {
        // execute per 1000 operations and re-initialize every 1000 update statements
        bulk.execute();
        bulk = db.wholesalers.initializeUnorderedBulkOp();
    }
});

// flush any remaining operations in the queue
if (counter % 1000 != 0) { bulk.execute(); }
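The trailing execute() call after the loop flushes whatever operations are still queued when the total number of matching documents is not an exact multiple of 1000.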

The next example applies to the newer MongoDB version 3.2, which has since deprecated the Bulk() API and provided a newer set of APIs using bulkWrite().

It uses the same cursor as above, but builds the array of bulk operations with the same forEach() cursor method, pushing each bulk-write document into the array. Because write commands can accept no more than 1000 operations, the operations need to be grouped so that there are at most 1000 of them, re-initialising the array when the loop hits the 1000th iteration:

var cursor = db.wholesalers.find({
        "brands": { "$exists": true, "$type": 4 }
    }),
    bulkUpdateOps = [];

cursor.snapshot().forEach(function(doc){
    bulkUpdateOps.push({
        "updateOne": {
            "filter": { "_id": doc._id },
            "update": { "$set": { "brandsnetherlands": doc.brands } }
        }
    });

    if (bulkUpdateOps.length === 1000) {
        db.wholesalers.bulkWrite(bulkUpdateOps);
        bulkUpdateOps = [];
    }
});

if (bulkUpdateOps.length > 0) { db.wholesalers.bulkWrite(bulkUpdateOps); }
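As a side note, if you are on MongoDB 4.2 or newer (beyond the versions discussed above), the copy can be pushed entirely to the server with an update that takes an aggregation pipeline, so no client-side loop or batching is needed at all. A minimal sketch, assuming the same collection and field names:

// pipeline-style update (MongoDB 4.2+): copies "brands" into "brandsnetherlands" server-side
db.wholesalers.updateMany(
    { "brands": { "$exists": true, "$type": 4 } },          // same filter as above
    [ { "$set": { "brandsnetherlands": "$brands" } } ]      // "$brands" refers to the existing field
);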
