private void addToArray(List<String> path, BsonValue value, BsonValue parentNode) { final BsonArray target = parentNode.asArray(); String idxStr = path.get(path.size() - 1); if ("-".equals(idxStr)) { /* "-" appends to the end of the array, see http://tools.ietf.org/html/rfc6902#section-4.1 */ target.add(value); } else { int idx = arrayIndex(idxStr.replaceAll("\"", ""), target.size(), false); target.add(idx, value); } }
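As a hypothetical, dependency-free sketch of the `-` semantics from RFC 6902 section 4.1 that `addToArray` implements (the `ArrayPatchDemo` class and `addAt` helper are illustration only, not part of the library):

```java
import java.util.ArrayList;
import java.util.List;

public class ArrayPatchDemo {
    // RFC 6902 array addressing: the index "-" appends to the end of the
    // array; a numeric index inserts at that position, shifting later elements.
    static void addAt(List<String> target, String idxStr, String value) {
        if ("-".equals(idxStr)) {
            target.add(value);                            // append
        } else {
            target.add(Integer.parseInt(idxStr), value);  // insert at index
        }
    }

    public static void main(String[] args) {
        List<String> arr = new ArrayList<>(List.of("a", "c"));
        addAt(arr, "1", "b");   // insert at index 1
        addAt(arr, "-", "d");   // append
        System.out.println(arr); // [a, b, c, d]
    }
}
```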
public static BsonArray asBson(final BsonValue source, final BsonValue target, EnumSet<DiffFlags> flags) { final List<Diff> diffs = new ArrayList<Diff>(); List<Object> path = new ArrayList<Object>(0); /* generate diffs in the order of their occurrence */ generateDiffs(diffs, path, source, target); if (!flags.contains(DiffFlags.OMIT_MOVE_OPERATION)) { /* merge remove & add pairs into move operations */ compactDiffs(diffs); } if (!flags.contains(DiffFlags.OMIT_COPY_OPERATION)) { /* introduce copy operations */ introduceCopyOperation(source, target, diffs); } return getBsonNodes(diffs, flags); }
@Test public void testPatchAppliedCleanly() throws Exception { for (int i = 0; i < jsonNode.size(); i++) { BsonDocument node = jsonNode.get(i).asDocument(); BsonValue first = node.get("first"); BsonValue second = node.get("second"); BsonArray patch = node.getArray("patch"); String message = node.containsKey("message") ? node.getString("message").getValue() : ""; BsonValue secondPrime = BsonPatch.apply(patch, first); Assert.assertThat(message, secondPrime, equalTo(second)); } }
@Test public void testSampleJsonDiff() throws Exception { for (int i = 0; i < jsonNode.size(); i++) { BsonValue first = jsonNode.get(i).asDocument().get("first"); BsonValue second = jsonNode.get(i).asDocument().get("second"); BsonArray actualPatch = BsonDiff.asBson(first, second); BsonValue secondPrime = BsonPatch.apply(actualPatch, first); Assert.assertEquals(second, secondPrime); } }
@Test public void testGeneratedJsonDiff() throws Exception { Random random = new Random(); for (int i = 0; i < 1000; i++) { BsonArray first = TestDataGenerator.generate(random.nextInt(10)); BsonArray second = TestDataGenerator.generate(random.nextInt(10)); BsonArray actualPatch = BsonDiff.asBson(first, second); BsonArray secondPrime = BsonPatch.apply(actualPatch, first).asArray(); Assert.assertEquals(second, secondPrime); } }
@Test public void testRenderedOperationsExceptMoveAndCopy() throws Exception { BsonDocument source = new BsonDocument(); source.put("age", new BsonInt32(10)); BsonDocument target = new BsonDocument(); target.put("height", new BsonInt32(10)); /* only ADD, REMOVE and REPLACE; do not normalize operations into MOVE & COPY */ EnumSet<DiffFlags> flags = DiffFlags.dontNormalizeOpIntoMoveAndCopy().clone(); BsonArray diff = BsonDiff.asBson(source, target, flags); for (BsonValue d : diff) { Assert.assertNotEquals(Operation.MOVE.rfcName(), d.asDocument().getString("op").getValue()); Assert.assertNotEquals(Operation.COPY.rfcName(), d.asDocument().getString("op").getValue()); } BsonValue targetPrime = BsonPatch.apply(diff, source); Assert.assertEquals(target, targetPrime); }
Object decode(BsonArray bsonArray, Field field, BsonMapperConfig bsonMapperConfig) { MAPPER_LAYER_COUNTER.addCount(bsonMapperConfig); try { Class<?> fieldType = field.getType(); if (fieldType.isArray()) { return handleArrayForBsonArray(bsonArray, field, bsonMapperConfig); } else if (Collection.class.isAssignableFrom(fieldType)) { return handleCollectionForBsonArray(bsonArray, field, bsonMapperConfig); } else { throw new BsonMapperConverterException( String.format("field %s should be an array or Collection because there is a BsonArray in the BsonDocument; BsonName is %s", field.getName(), Utils.getBsonName(field))); } } finally { MAPPER_LAYER_COUNTER.reduceCount(); } }
private Object handleArrayForBsonArray(BsonArray bsonArray, Field field, BsonMapperConfig bsonMapperConfig) { ArrayList<Object> arrayList = new ArrayList<Object>(); Class<?> fieldClazz = field.getType(); for (BsonValue bsonValue : bsonArray) { if (bsonValue == null) { continue; } if (bsonValue.isArray()) { arrayList.add(decode(bsonValue.asArray(), field, bsonMapperConfig)); } else { Object javaValue; if (bsonValue.isDocument()) { javaValue = BsonValueConverterRepertory.getBsonDocumentConverter().decode(bsonValue.asDocument(), fieldClazz.getComponentType(), bsonMapperConfig); } else { javaValue = BsonValueConverterRepertory.getValueConverterByBsonType(bsonValue.getBsonType()).decode(bsonValue); } arrayList.add(javaValue); } } return arrayList.toArray((Object[]) Array.newInstance(fieldClazz.getComponentType(), 0)); }
public void encode(BsonArray bsonArray, Field field, Object fieldValue, BsonMapperConfig bsonMapperConfig) { MAPPER_LAYER_COUNTER.addCount(bsonMapperConfig); try { Class<?> fieldType = field.getType(); String bsonName = Utils.getBsonName(field); if (fieldType.isArray()) { handleArrayForEncodeDocument(bsonArray, field, (Object[]) fieldValue, bsonMapperConfig); } else if (Collection.class.isAssignableFrom(fieldType)) { handleCollectionForEncodeDocument(bsonArray, field, (Collection) fieldValue, bsonMapperConfig); } else { throw new BsonMapperConverterException("this should never happen: the field " + bsonName + " should be an array or Collection"); } } finally { MAPPER_LAYER_COUNTER.reduceCount(); } }
private ArrayList<BsonValue> getBsonValueList(Field field, Collection values, BsonMapperConfig bsonMapperConfig, Class<?> componentType) { ArrayList<BsonValue> arrayList = new ArrayList<BsonValue>(); for (Object o : values) { if (o == null) { continue; } Class<?> oClazz = o.getClass(); if (Utils.isArrayType(oClazz)) { BsonArray innerBsonArray = new BsonArray(); encode(innerBsonArray, field, o, bsonMapperConfig); arrayList.add(innerBsonArray); } else if (componentType.isInstance(o)) { if (BsonValueConverterRepertory.isCanConverterValueType(componentType)) { arrayList.add(BsonValueConverterRepertory.getValueConverterByClazz(componentType).encode(o)); } else { BsonDocument arrayEle = new BsonDocument(); BsonValueConverterRepertory.getBsonDocumentConverter().encode(arrayEle, o, bsonMapperConfig); arrayList.add(arrayEle); } } else { throw new BsonMapperConverterException(String.format("array field has an element whose type differs from the declared componentType. field name: %s", field.getName())); } } return arrayList; }
public static BsonDocument getBsonDocument() { BsonDocument bsonObj = new BsonDocument().append("testDouble", new BsonDouble(20.777)); List<BsonDocument> list = new ArrayList<BsonDocument>(); list.add(bsonObj); list.add(bsonObj); byte[] bytes = new byte[3]; bytes[0] = 3; bytes[1] = 2; bytes[2] = 1; BsonDocument bsonDocument = new BsonDocument().append("testDouble", new BsonDouble(20.99)) .append("testString", new BsonString("testStringV")) .append("testArray", new BsonArray(list)); return new BsonDocument().append("testDouble", new BsonDouble(20.99)) .append("testString", new BsonString("testStringV")) .append("testArray", new BsonArray(list)) .append("bson_test", bsonDocument) .append("testBinary", new BsonBinary(bytes)) .append("testBsonUndefined", new BsonUndefined()) .append("testObjectId", new BsonObjectId()) .append("testStringObjectId", new BsonObjectId()) .append("testBoolean", new BsonBoolean(true)) .append("testDate", new BsonDateTime(time)) .append("testNull", new BsonNull()) .append("testInt", new BsonInt32(233)) .append("testLong", new BsonInt64(233332)); }
@Test public void testBsonDocumentDeSerialize() { BsonDocument document = new BsonDocument().append("a", new BsonString("MongoDB")) .append("b", new BsonArray(Arrays.asList(new BsonInt32(1), new BsonInt32(2)))) .append("c", new BsonBoolean(true)) .append("d", new BsonDateTime(0)); String json = oson.useAttribute(false).setValueOnly(true).serialize(document); String expected = "{\"a\":\"MongoDB\",\"b\":[1,2],\"c\":true,\"d\":0}"; assertEquals(expected, json); BsonDocument document2 = oson.deserialize(json, BsonDocument.class); assertEquals(expected, oson.serialize(document2)); }
private BsonDocument fromObject(Object value) { final BsonDocument document = new BsonDocument(); final TypeParser<?> metadata = getTypeParser(value.getClass()); final BsonArray lazyStack = new BsonArray(); for(PrimaryField field : metadata.getAllFields()) { final Object fieldValue = extractValue(value, field); final BsonValue parsedValue; checkRequired(field, fieldValue); parsedValue = topParser.toBson(fieldValue, field); if(parsedValue instanceof BsonLazyObjectId) { lazyStack.add(parsedValue); } else { document.put(field.getName(), parsedValue); } } document.append(SmofParser.ON_INSERT, lazyStack); return document; }
public static BsonValue filterValue(BsonValue value) { BsonValue returnedValue = QUESTION_MARK_BSON; if (value instanceof BsonDocument) { returnedValue = filterParameters((BsonDocument) value); } else if (value instanceof BsonArray) { BsonArray array = (BsonArray) value; array = array.clone(); returnedValue = array; int length = array.size(); for (int i = 0; i < length; ++i) { BsonValue bsonValue = array.get(i); array.set(i, filterValue(bsonValue)); } } return returnedValue; }
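The recursive masking idea in `filterValue` (scalars are replaced by a `?` placeholder while container structure is preserved) can be sketched without the BSON types, using plain collections; `MaskDemo` and `mask` are hypothetical names for illustration only:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class MaskDemo {
    // Plain-collections analogue of filterValue: scalar values are replaced
    // by "?", while the shape of nested lists and maps is preserved.
    static Object mask(Object value) {
        if (value instanceof Map<?, ?> m) {
            return m.entrySet().stream()
                    .collect(Collectors.toMap(Map.Entry::getKey, e -> mask(e.getValue())));
        } else if (value instanceof List<?> l) {
            return l.stream().map(MaskDemo::mask).collect(Collectors.toList());
        }
        return "?"; // scalar: hide the concrete value
    }

    public static void main(String[] args) {
        System.out.println(mask(List.of(Map.of("user", "alice"), 42)));
    }
}
```

Note that, like the original, the containers themselves survive the masking, which keeps the logged query shape readable while hiding parameter values.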
@Before @Override public void setUp() throws Throwable { super.setUp(); gridFSBucket = GridFSBuckets.create(database); filesCollection = initializeCollection(new MongoNamespace(getDefaultDatabaseName(), "fs.files")) .withDocumentClass(BsonDocument.class); chunksCollection = initializeCollection(new MongoNamespace(getDefaultDatabaseName(), "fs.chunks")) .withDocumentClass(BsonDocument.class); List<BsonDocument> filesDocuments = processFiles(data.getArray("files", new BsonArray()), new ArrayList<BsonDocument>()); if (!filesDocuments.isEmpty()) { filesCollection.insertMany(filesDocuments).timeout(30, SECONDS).first().toBlocking().first(); } List<BsonDocument> chunksDocuments = processChunks(data.getArray("chunks", new BsonArray()), new ArrayList<BsonDocument>()); if (!chunksDocuments.isEmpty()) { chunksCollection.insertMany(chunksDocuments).timeout(30, SECONDS).first().toBlocking().first(); } }
private List<BsonDocument> processFiles(final BsonArray bsonArray, final List<BsonDocument> documents) { for (BsonValue rawDocument : bsonArray.getValues()) { if (rawDocument.isDocument()) { BsonDocument document = rawDocument.asDocument(); if (document.get("length").isInt32()) { document.put("length", new BsonInt64(document.getInt32("length").getValue())); } if (document.containsKey("metadata") && document.getDocument("metadata").isEmpty()) { document.remove("metadata"); } if (document.containsKey("aliases") && document.getArray("aliases").getValues().isEmpty()) { document.remove("aliases"); } if (document.containsKey("contentType") && document.getString("contentType").getValue().isEmpty()) { document.remove("contentType"); } documents.add(document); } } return documents; }
@Before @Override public void setUp() throws Throwable { super.setUp(); gridFSBucket = GridFSBuckets.create(database); filesCollection = initializeCollection(new MongoNamespace(getDefaultDatabaseName(), "fs.files")) .withDocumentClass(BsonDocument.class); chunksCollection = initializeCollection(new MongoNamespace(getDefaultDatabaseName(), "fs.chunks")) .withDocumentClass(BsonDocument.class); List<BsonDocument> filesDocuments = processFiles(data.getArray("files", new BsonArray()), new ArrayList<BsonDocument>()); if (!filesDocuments.isEmpty()) { ObservableSubscriber<Success> filesInsertSubscriber = new ObservableSubscriber<Success>(); filesCollection.insertMany(filesDocuments).subscribe(filesInsertSubscriber); filesInsertSubscriber.await(30, SECONDS); } List<BsonDocument> chunksDocuments = processChunks(data.getArray("chunks", new BsonArray()), new ArrayList<BsonDocument>()); if (!chunksDocuments.isEmpty()) { ObservableSubscriber<Success> chunksInsertSubscriber = new ObservableSubscriber<Success>(); chunksCollection.insertMany(chunksDocuments).subscribe(chunksInsertSubscriber); chunksInsertSubscriber.await(30, SECONDS); } }
/** * Geospatial query, e.g. db.vertices.find({ "urn:oliot:ubv:mda:gps" : { $near : { $geometry: { type: "Point", coordinates: [ -1.1673,52.93]}, $maxDistance: 50000}}}) * * @param key document key; should be indexed by 2dsphere: db.vertices.createIndex({"urn:oliot:ubv:mda:gps" : "2dsphere"}) * @param lon longitude * @param lat latitude * @param radius search radius in metres * @return parallel stream of vertices within the radius */ public Stream<ChronoVertex> getChronoVertexStream(String key, double lon, double lat, double radius) { HashSet<ChronoVertex> ret = new HashSet<ChronoVertex>(); BsonArray coordinates = new BsonArray(); coordinates.add(new BsonDouble(lon)); coordinates.add(new BsonDouble(lat)); BsonDocument geometry = new BsonDocument(); geometry.put("type", new BsonString("Point")); geometry.put("coordinates", coordinates); BsonDocument near = new BsonDocument(); near.put("$geometry", geometry); near.put("$maxDistance", new BsonDouble(radius)); BsonDocument geoquery = new BsonDocument(); geoquery.put("$near", near); BsonDocument queryDoc = new BsonDocument(); queryDoc.put(key, geoquery); MongoCursor<BsonDocument> cursor = vertices.find(queryDoc).projection(Tokens.PRJ_ONLY_ID).iterator(); while (cursor.hasNext()) { BsonDocument v = cursor.next(); ret.add(new ChronoVertex(v.getString(Tokens.ID).getValue(), this)); } return ret.parallelStream(); }
public Set<CachedVertexEvent> getVertexEventSet(final Direction direction, final BsonArray labels, final Long left, final AC tt, final int branchFactor) { HashSet<Long> labelIdxSet = null; if (labels != null) { labelIdxSet = convertToLabelIdxSet(labels); } if (direction.equals(Direction.OUT)) { return this.getOutVertexEventSet(labelIdxSet, left, tt, branchFactor); } else if (direction.equals(Direction.IN)) return this.getInVertexEventSet(labelIdxSet, left, tt, branchFactor); else { Set<CachedVertexEvent> ret = this.getOutVertexEventSet(labelIdxSet, left, tt, branchFactor); ret.addAll(this.getInVertexEventSet(labelIdxSet, left, tt, branchFactor)); return ret; } }
public BsonDocument putQuantityList(BsonDocument base, List<QuantityElement> quantityList) { BsonArray quantityArray = new BsonArray(); for (QuantityElement quantityElement : quantityList) { BsonDocument bsonQuantityElement = new BsonDocument("epcClass", new BsonString(quantityElement.getEpcClass())); if (quantityElement.getQuantity() != null) { bsonQuantityElement.put("quantity", new BsonDouble(quantityElement.getQuantity())); } if (quantityElement.getUom() != null) { bsonQuantityElement.put("uom", new BsonString(quantityElement.getUom())); } quantityArray.add(bsonQuantityElement); } base.put("quantityList", quantityArray); return base; }
public BsonDocument putChildQuantityList(BsonDocument base, List<QuantityElement> childQuantityList) { BsonArray quantityArray = new BsonArray(); for (QuantityElement quantityElement : childQuantityList) { BsonDocument bsonQuantityElement = new BsonDocument("epcClass", new BsonString(quantityElement.getEpcClass())); if (quantityElement.getQuantity() != null) { bsonQuantityElement.put("quantity", new BsonDouble(quantityElement.getQuantity())); } if (quantityElement.getUom() != null) { bsonQuantityElement.put("uom", new BsonString(quantityElement.getUom())); } quantityArray.add(bsonQuantityElement); } base.put("childQuantityList", quantityArray); return base; }
public BsonDocument putOutputQuantityList(BsonDocument base, List<QuantityElement> outputQuantityList) { BsonArray quantityArray = new BsonArray(); for (QuantityElement quantityElement : outputQuantityList) { BsonDocument bsonQuantityElement = new BsonDocument("epcClass", new BsonString(quantityElement.getEpcClass())); if (quantityElement.getQuantity() != null) { bsonQuantityElement.put("quantity", new BsonDouble(quantityElement.getQuantity())); } if (quantityElement.getUom() != null) { bsonQuantityElement.put("uom", new BsonString(quantityElement.getUom())); } quantityArray.add(bsonQuantityElement); } base.put("outputQuantityList", quantityArray); return base; }
public static BsonDocument addTemporalRelationFilterQuery(BsonDocument filter, LongInterval left, AC ss, AC se, AC es, AC ee) { if (filter == null) filter = new BsonDocument(); BsonArray and = new BsonArray(); if (ss != null) and.add(new BsonDocument(Tokens.START, new BsonDocument(ss.toString(), new BsonDateTime(left.getStart())))); if (se != null) and.add(new BsonDocument(Tokens.START, new BsonDocument(se.toString(), new BsonDateTime(left.getEnd())))); if (es != null) and.add(new BsonDocument(Tokens.END, new BsonDocument(es.toString(), new BsonDateTime(left.getStart())))); if (ee != null) and.add(new BsonDocument(Tokens.END, new BsonDocument(ee.toString(), new BsonDateTime(left.getEnd())))); return filter.append(C.$and.toString(), and); }
public static BulkOperationResult bulkUpsertDocuments( final MongoCollection<BsonDocument> coll, final BsonArray documents, final BsonDocument filter, final BsonDocument shardKeys) { Objects.requireNonNull(coll); Objects.requireNonNull(documents); ObjectId newEtag = new ObjectId(); List<WriteModel<BsonDocument>> wm = getBulkWriteModel( coll, documents, filter, shardKeys, newEtag); BulkWriteResult result = coll.bulkWrite(wm); return new BulkOperationResult(HttpStatus.SC_OK, newEtag, result); }
private BsonArray getAggregationMetadata(BsonDocument contentToTransform) { List<Optional<BsonValue>> ___aggrs = JsonUtils .getPropsFromPath(contentToTransform, "$." + AbstractAggregationOperation.AGGREGATIONS_ELEMENT_NAME); if (___aggrs == null || ___aggrs.isEmpty()) { return null; } Optional<BsonValue> __aggrs = ___aggrs.get(0); if (__aggrs == null || !__aggrs.isPresent()) { return null; } BsonValue _aggrs = __aggrs.get(); if (_aggrs.isArray()) { return _aggrs.asArray(); } else { return null; } }
static BsonDocument getExistsQueryObject(String[] fieldArr, String str, BsonBoolean isExist) { BsonArray conjQueries = new BsonArray(); for (String field : fieldArr) { BsonDocument query = new BsonDocument(); if (str != null) { str = encodeMongoObjectKey(str); query.put(field + "." + str, new BsonDocument("$exists", isExist)); } else { query.put(field, new BsonDocument("$exists", isExist)); } conjQueries.add(query); } if (conjQueries.size() != 0) { BsonDocument queryObject = new BsonDocument(); if (isExist.equals(BsonBoolean.TRUE)) queryObject.put("$or", conjQueries); else { queryObject.put("$and", conjQueries); } return queryObject; } else { return null; } }
/** * Return the edges incident to the vertex according to the provided direction * and edge labels. * * @param direction * the direction of the edges to retrieve * @param labels * the labels of the edges to retrieve * @param branchFactor * the maximum number of edges to retrieve per direction * @return an iterable of incident edges */ public Iterable<CachedChronoEdge> getChronoEdges(final Direction direction, final BsonArray labels, final int branchFactor) { HashSet<Long> labelIdxSet = null; if (labels != null) { labelIdxSet = convertToLabelIdxSet(labels); } if (direction.equals(Direction.OUT)) { return this.getOutChronoEdges(labelIdxSet, branchFactor); } else if (direction.equals(Direction.IN)) return this.getInChronoEdges(labelIdxSet, branchFactor); else { return new MultiIterable<CachedChronoEdge>(Arrays.asList(this.getInChronoEdges(labelIdxSet, branchFactor), this.getOutChronoEdges(labelIdxSet, branchFactor))); } }
private Set<ChronoEdge> getOutChronoEdgeSet(final BsonArray labels, final int branchFactor) { HashSet<ChronoEdge> edgeSet = new HashSet<ChronoEdge>(); BsonDocument filter = new BsonDocument(); BsonDocument inner = new BsonDocument(); filter.put(Tokens.OUT_VERTEX, new BsonString(this.toString())); if (labels != null && labels.size() != 0) { inner.put(Tokens.FC.$in.toString(), labels); filter.put(Tokens.LABEL, inner); } Iterator<BsonDocument> it = null; if (branchFactor == Integer.MAX_VALUE) it = graph.getEdgeCollection().find(filter).projection(Tokens.PRJ_ONLY_ID).iterator(); else it = graph.getEdgeCollection().find(filter).projection(Tokens.PRJ_ONLY_ID).limit(branchFactor).iterator(); while (it.hasNext()) { BsonDocument d = it.next(); edgeSet.add(new ChronoEdge(d.getString(Tokens.ID).getValue(), this.graph)); } return edgeSet; }
static BsonDocument getINExtensionQueryObject(String type, String[] fields, String csv) { String[] paramValueArr = csv.split(","); BsonArray subStringList = new BsonArray(); for (String param : paramValueArr) { String val = param.trim(); subStringList.add(converseType(val)); } if (!subStringList.isEmpty()) { BsonArray subList = new BsonArray(); for (int i = 0; i < fields.length; i++) { BsonDocument sub = new BsonDocument(); sub.put(fields[i], new BsonDocument("$in", subStringList)); subList.add(sub); } BsonDocument subBase = new BsonDocument(); subBase.put("$or", subList); return subBase; } return null; }
static BsonArray getQuantityObjectList(List<QuantityElementType> qetList, Integer gcpLength) { BsonArray quantityList = new BsonArray(); for (int i = 0; i < qetList.size(); i++) { BsonDocument quantity = new BsonDocument(); QuantityElementType qet = qetList.get(i); if (qet.getEpcClass() != null) quantity.put("epcClass", new BsonString(getClassEPC(qet.getEpcClass().toString(), gcpLength))); if (qet.getQuantity() != 0) { quantity.put("quantity", new BsonDouble(qet.getQuantity())); } if (qet.getUom() != null) quantity.put("uom", new BsonString(qet.getUom().toString())); quantityList.add(quantity); } return quantityList; }
public Iterable<VertexEvent> getVertexEvents(final Direction direction, final BsonArray labels, TemporalType typeOfVertexEvent, final AC tt, final AC s, final AC e, final AC ss, final AC se, final AC es, final AC ee, Position pos) { if (typeOfVertexEvent == null) typeOfVertexEvent = this.temporalType; /* T -> T */ Set<ChronoEdge> edgeSet = vertex.getChronoEdgeSet(direction, labels, Integer.MAX_VALUE); return edgeSet.parallelStream().map(edge -> { Long t = edge.getTimestamp(timestamp, tt); return edge.pickTimestamp(t); }).filter(edgeEvent -> edgeEvent != null).map(edgeEvent -> edgeEvent.getVertexEvent(direction.opposite())).collect(Collectors.toSet()); }
public void capture(BsonObjectId dataID, Long eventTime, Set<String> epcList, BsonArray epcQuantities, String readPoint, String bizLocation, BsonArray sourceList, BsonArray destinationList) { ChronoGraph pg = Configuration.persistentGraph; if (epcList != null && !epcList.isEmpty()) { epcList.stream().forEach(object -> { MongoWriterUtil.addBasicTimestampProperties(pg, eventTime, object, readPoint, bizLocation, sourceList, destinationList); pg.addTimestampVertexProperty(object, eventTime, "data", dataID); }); } if (epcQuantities != null && !epcQuantities.isEmpty()) { epcQuantities.stream().forEach(classElem -> { MongoWriterUtil.addBasicTimestampProperties(pg, eventTime, classElem, readPoint, bizLocation, sourceList, destinationList); pg.addTimestampVertexProperty(classElem.asDocument().getString("epcClass").getValue(), eventTime, "data", dataID); }); } return; }
public static boolean doesRequestUsesDotNotation(BsonValue content) { if (content.isDocument()) { BsonDocument obj = content.asDocument(); return obj.keySet().stream().anyMatch(key -> { return key.contains("."); }); } else if (content.isArray()) { BsonArray objs = content.asArray(); return objs.stream().anyMatch(obj -> { if (obj.isDocument()) { return doesRequestUsesDotNotation(obj); } else { return true; } }); } else { return true; } }
private Stream<ChronoVertex> getOutChronoVertexStream(BsonArray labels, final int branchFactor, final boolean setParallel) { HashSet<ChronoVertex> vertexSet = new HashSet<ChronoVertex>(); BsonDocument filter = new BsonDocument(); BsonDocument inner = new BsonDocument(); filter.put(Tokens.OUT_VERTEX, new BsonString(this.toString())); if (labels != null && labels.size() != 0) { inner.put(Tokens.FC.$in.toString(), labels); filter.put(Tokens.LABEL, inner); } Iterator<BsonDocument> it = null; if (branchFactor == Integer.MAX_VALUE) it = graph.getEdgeCollection().find(filter).projection(Tokens.PRJ_ONLY_ID).iterator(); else it = graph.getEdgeCollection().find(filter).projection(Tokens.PRJ_ONLY_ID).limit(branchFactor).iterator(); while (it.hasNext()) { BsonDocument d = it.next(); vertexSet.add(new ChronoVertex(d.getString(Tokens.ID).getValue().split("\\|")[2], this.graph)); } if (setParallel) return vertexSet.parallelStream(); else return vertexSet.stream(); }
public static boolean doesRequestUsesUpdateOperators(BsonValue content) { if (content.isDocument()) { BsonDocument obj = content.asDocument(); return obj.keySet().stream().anyMatch(key -> { return UPDATE_OPERATORS.contains(key); }); } else if (content.isArray()) { BsonArray objs = content.asArray(); return objs.stream().allMatch(obj -> { if (obj.isDocument()) { return doesRequestUsesUpdateOperators(obj); } else { return true; } }); } else { return true; } }
static BsonArray getQuantityObjectList(List<QuantityElementType> qetList, Integer gcpLength) { BsonArray quantityList = new BsonArray(); for (int i = 0; i < qetList.size(); i++) { BsonDocument quantity = new BsonDocument(); QuantityElementType qet = qetList.get(i); if (qet.getEpcClass() != null) quantity.put("epcClass", new BsonString(getClassEPC(qet.getEpcClass().toString(), gcpLength))); if (qet.getQuantity() != null && qet.getQuantity().doubleValue() != 0) { quantity.put("quantity", new BsonDouble(qet.getQuantity().doubleValue())); } if (qet.getUom() != null) quantity.put("uom", new BsonString(qet.getUom().toString())); quantityList.add(quantity); } return quantityList; }
/** * Reading from BSON to GSON */ @Test public void bsonToGson() throws Exception { BsonDocument document = new BsonDocument(); document.append("boolean", new BsonBoolean(true)); document.append("int32", new BsonInt32(32)); document.append("int64", new BsonInt64(64)); document.append("double", new BsonDouble(42.42D)); document.append("string", new BsonString("foo")); document.append("null", new BsonNull()); document.append("array", new BsonArray()); document.append("object", new BsonDocument()); JsonElement element = TypeAdapters.JSON_ELEMENT.read(new BsonReader(new BsonDocumentReader(document))); check(element.isJsonObject()); check(element.getAsJsonObject().get("boolean").getAsJsonPrimitive().isBoolean()); check(element.getAsJsonObject().get("boolean").getAsJsonPrimitive().getAsBoolean()); check(element.getAsJsonObject().get("int32").getAsJsonPrimitive().isNumber()); check(element.getAsJsonObject().get("int32").getAsJsonPrimitive().getAsNumber().intValue()).is(32); check(element.getAsJsonObject().get("int64").getAsJsonPrimitive().isNumber()); check(element.getAsJsonObject().get("int64").getAsJsonPrimitive().getAsNumber().longValue()).is(64L); check(element.getAsJsonObject().get("double").getAsJsonPrimitive().isNumber()); check(element.getAsJsonObject().get("double").getAsJsonPrimitive().getAsNumber().doubleValue()).is(42.42D); check(element.getAsJsonObject().get("string").getAsJsonPrimitive().isString()); check(element.getAsJsonObject().get("string").getAsJsonPrimitive().getAsString()).is("foo"); check(element.getAsJsonObject().get("null").isJsonNull()); check(element.getAsJsonObject().get("array").isJsonArray()); check(element.getAsJsonObject().get("object").isJsonObject()); }