Using MongoDB 3.4.4 and newer ($objectToArray and $arrayToObject were introduced in 3.4.4):
db.collection.aggregate([
  { "$addFields": {
    "data.visits.daily": {
      "$arrayToObject": {
        "$filter": {
          "input": { "$objectToArray": "$data.visits.daily" },
          "as": "el",
          "cond": {
            "$and": [
              { "$gte": ["$$el.k", "2018-09-06"] },
              { "$lte": ["$$el.k", "2018-09-07"] }
            ]
          }
        }
      }
    }
  } }
])
The above pipeline yields the final output:
{
  "data" : {
    "visits" : {
      "daily" : {
        "2018-09-06" : 2969,
        "2018-09-07" : 2624
      }
    }
  }
}
Explanations
The pipeline can be decomposed to show each individual operator's results.
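For context, the walkthrough below assumes a source document shaped like the following (reconstructed from the $objectToArray output shown further down):

{
  "_id" : ObjectId("5bab6d09b1951fef20a5dce4"),
  "data" : {
    "visits" : {
      "daily" : {
        "2018-09-05" : 3586,
        "2018-09-06" : 2969,
        "2018-09-07" : 2624,
        "2018-09-08" : 2803,
        "2018-09-09" : 3439,
        "2018-09-10" : 3655
      }
    }
  }
}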
$objectToArray
$objectToArray enables you to transform the document with dynamic keys
into an array that contains an element for each field/value pair in the original document. Each element in the returned array is a document with two fields, k and v.
Running a pipeline with just this operator in a $project stage
db.collection.aggregate([
  { "$project": {
    "keys": { "$objectToArray": "$data.visits.daily" }
  } }
])
yields
{
  "_id" : ObjectId("5bab6d09b1951fef20a5dce4"),
  "keys" : [
    { "k" : "2018-09-05", "v" : 3586 },
    { "k" : "2018-09-06", "v" : 2969 },
    { "k" : "2018-09-07", "v" : 2624 },
    { "k" : "2018-09-08", "v" : 2803 },
    { "k" : "2018-09-09", "v" : 3439 },
    { "k" : "2018-09-10", "v" : 3655 }
  ]
}
$filter
The $filter operator acts as a filtering mechanism for the array produced by $objectToArray: it selects the subset of the array that satisfies the specified condition, and that condition effectively becomes your query. Because the keys are ISO-formatted date strings (YYYY-MM-DD), comparing them lexicographically with $gte/$lte orders them the same way as the dates they represent.
Consider the following pipeline, which returns an array of the key/value pairs matching the condition "2018-09-06" <= key <= "2018-09-07":
db.collection.aggregate([
  { "$project": {
    "keys": {
      "$filter": {
        "input": { "$objectToArray": "$data.visits.daily" },
        "as": "el",
        "cond": {
          "$and": [
            { "$gte": ["$$el.k", "2018-09-06"] },
            { "$lte": ["$$el.k", "2018-09-07"] }
          ]
        }
      }
    }
  } }
])
which yields
{
  "_id" : ObjectId("5bab6d09b1951fef20a5dce4"),
  "keys" : [
    { "k" : "2018-09-06", "v" : 2969 },
    { "k" : "2018-09-07", "v" : 2624 }
  ]
}
$arrayToObject
This will transform the filtered array above from
[
  { "k" : "2018-09-06", "v" : 2969 },
  { "k" : "2018-09-07", "v" : 2624 }
]
back into a document with the dynamic keys
{
  "2018-09-06" : 2969,
  "2018-09-07" : 2624
}
so running the pipeline
db.collection.aggregate([
  { "$project": {
    "keys": {
      "$arrayToObject": {
        "$filter": {
          "input": { "$objectToArray": "$data.visits.daily" },
          "as": "el",
          "cond": {
            "$and": [
              { "$gte": ["$$el.k", "2018-09-06"] },
              { "$lte": ["$$el.k", "2018-09-07"] }
            ]
          }
        }
      }
    }
  } }
])
will produce
{
  "_id" : ObjectId("5bab6d09b1951fef20a5dce4"),
  "keys" : {
    "2018-09-06" : 2969,
    "2018-09-07" : 2624
  }
}
But of course you would want to preserve the original schema, i.e. keep the existing fields, so you need to use $addFields instead of the $project stage used above for illustration.
$addFields
This is equivalent to a $project stage that explicitly specifies all existing fields in the input documents and adds the new fields. Specifying an existing field name in an $addFields operation causes the original field to be replaced, and you use dot notation to update the embedded data.visits.daily field with the dynamic keys.
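As a quick sketch of the difference, assume the document also carried a hypothetical sibling field such as data.visits.total (not part of the original example). $addFields with dot notation rewrites only the embedded daily document and leaves everything else untouched, while the $project stages above keep only _id and the fields you list:

// Hypothetical input (the "total" field is an assumption for illustration):
// { "data" : { "visits" : { "daily" : { ... }, "total" : 19076 } } }

var filteredDaily = {
  "$arrayToObject": {
    "$filter": {
      "input": { "$objectToArray": "$data.visits.daily" },
      "as": "el",
      "cond": {
        "$and": [
          { "$gte": ["$$el.k", "2018-09-06"] },
          { "$lte": ["$$el.k", "2018-09-07"] }
        ]
      }
    }
  }
};

// $project keeps only what you list: the result has just _id and "keys".
db.collection.aggregate([{ "$project": { "keys": filteredDaily } }]);

// $addFields keeps every existing field; dot notation replaces only the
// embedded data.visits.daily value, so the hypothetical "total" survives.
db.collection.aggregate([{ "$addFields": { "data.visits.daily": filteredDaily } }]);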