Deserializing Dates from MongoDB with a custom CodecProvider in Java gives null results

I have implemented a custom MongoDB CodecProvider to map documents to my Java objects, based on this GitHub gist. However, I cannot deserialize Date values; null is returned instead. Here is the snippet of my custom encoder implementation for my POJO, AuditLog:
public void encode(BsonWriter writer, AuditLog value, EncoderContext encoderContext) {
Document document = new Document();
DateCodec dateCodec = new DateCodec();
ObjectId id = value.getLogId();
Date timestamp = value.getTimestamp();
String deviceId = value.getDeviceId();
String userId = value.getUserId();
String requestId = value.getRequestId();
String operationType = value.getOperationType();
String message = value.getMessage();
String serviceName = value.getServiceName();
String className = value.getClassName();
if (null != id) {
document.put("_id", id);
}
if (null != timestamp) {
document.put("timestamp", timestamp);
}
if (null != deviceId) {
document.put("deviceId", deviceId);
}
if (null != userId) {
document.put("userId", userId);
}
if (null != requestId) {
document.put("requestId", requestId);
}
if (null != operationType) {
document.put("operationType", operationType);
}
if (null != message) {
document.put("message", message);
}
if (null != serviceName) {
document.put("serviceName", serviceName);
}
if (null != className) {
document.put("className", className);
}
documentCodec.encode(writer, document, encoderContext);
}
and decoder:
public AuditLog decode(BsonReader reader, DecoderContext decoderContext) {
Document document = documentCodec.decode(reader, decoderContext);
System.out.println("document " + document);
AuditLog auditLog = new AuditLog();
auditLog.setLogId(document.getObjectId("_id"));
auditLog.setTimestamp(document.getDate("timestamp"));
auditLog.setDeviceId(document.getString("deviceId"));
auditLog.setUserId(document.getString("userId"));
auditLog.setRequestId(document.getString("requestId"));
auditLog.setOperationType(document.getString("operationType"));
auditLog.setMessage(document.getString("message"));
auditLog.setServiceName(document.getString("serviceName"));
auditLog.setClassName(document.getString("className"));
return auditLog;
}
and this is how I am reading:
public void getAuthenticationEntries() {
Codec<Document> defaultDocumentCodec = MongoClient.getDefaultCodecRegistry().get(Document.class);
AuditLogCodec auditLogCodec = new AuditLogCodec(defaultDocumentCodec);
CodecRegistry codecRegistry = CodecRegistries.fromRegistries(MongoClient.getDefaultCodecRegistry(),
CodecRegistries.fromCodecs(auditLogCodec));
MongoClientOptions options = MongoClientOptions.builder().codecRegistry(codecRegistry).build();
MongoClient mc = new MongoClient("1.2.3.4:27017", options);
MongoCollection<AuditLog> collection = mc.getDatabase("myDB").getCollection("myCol",
AuditLog.class);
BasicDBObject neQuery = new BasicDBObject();
neQuery.put("myFiltr", new BasicDBObject("$eq", "mystuffr"));
FindIterable<AuditLog> cursor = collection.find(neQuery);
List<AuditLog> cleanList = new ArrayList<AuditLog>();
for (AuditLog object : cursor) {
System.out.println("timestamp: " + object.getTimestamp());
}
}
My pojo:
public class AuditLog implements Bson {
@Id
private ObjectId logId;
@JsonProperty("@timestamp")
private Date timestamp;
@JsonProperty("deviceId")
private String deviceId;
@JsonProperty("userId")
private String userId;
@JsonProperty("requestId")
private String requestId;
@JsonProperty("operationType")
private String operationType;
@JsonProperty("message")
private String message;
@JsonProperty("serviceName")
private String serviceName;
@JsonProperty("className")
private String className;
//getters and setters
}

After further research, I fixed the problem of the returned null values. The mongoimport command had been used to import the log files into MongoDB from Elasticsearch, but the time field was not converted to ISODate during the import. What I had to do was update the field to an ISODate using the command below:
db.Collection.find().forEach(function (doc) {
doc.time = new ISODate(doc.time);
db.Collection.save(doc);
});
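For completeness, here is a hedged sanity check (not part of the original fix) that prints the type the driver actually reports for the field, reusing the client from the question; before the conversion it is typically a plain string, afterwards it should be java.util.Date:
Document raw = mc.getDatabase("myDB").getCollection("myCol").find().first();
// The custom decoder's document.getDate("timestamp") only works once this is a BSON date
System.out.println(raw == null ? "collection is empty"
        : (raw.get("timestamp") == null ? "no timestamp field" : raw.get("timestamp").getClass()));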
Here is a related question that tackles a similar challenge.

Related

Replacement for "GROUP BY" in ContentResolver query in Android Q ( Android 10, API 29 changes)

I'm upgrading some legacy code to target Android Q, and of course this code stopped working:
String[] PROJECTION_BUCKET = {MediaStore.Images.ImageColumns.BUCKET_ID,
MediaStore.Images.ImageColumns.BUCKET_DISPLAY_NAME,
MediaStore.Images.ImageColumns.DATE_TAKEN,
MediaStore.Images.ImageColumns.DATA,
"COUNT(" + MediaStore.Images.ImageColumns._ID + ") AS COUNT",
MediaStore.Files.FileColumns.MEDIA_TYPE,
MediaStore.MediaColumns._ID};
String BUCKET_GROUP_BY = " 1) and " + BUCKET_WHERE.toString() + " GROUP BY 1,(2";
cur = context.getContentResolver().query(images, PROJECTION_BUCKET,
BUCKET_GROUP_BY, null, BUCKET_ORDER_BY);
android.database.sqlite.SQLiteException: near "GROUP": syntax error (code 1 SQLITE_ERROR[1])
This is supposed to obtain a list of images with album name, date, and picture count, one image for each album, so we can build an album picker screen without querying all pictures and looping through them to create the albums.
Is it possible to group query results with ContentResolver now that raw SQL clauses have stopped working?
(I know that ImageColumns.DATA and "COUNT() AS COUNT" are deprecated too, but this is a question about GROUP BY)
(There is a way to query albums and then separately query photos to obtain a photo URI for each album cover, but I want to avoid that overhead.)
Unfortunately, GROUP BY is no longer supported in Android 10 and above, nor are aggregate functions such as COUNT. This is by design and there is no workaround.
The solution is what you are actually trying to avoid: query, iterate, and compute the metrics yourself.
To get you started you can use the next snippet, which resolves the buckets (albums) and the number of records in each one.
I haven't added code to resolve the thumbnails, but that is easy: perform a query for each bucket id from the Album instances and use the image from the first record (a sketch of that lookup follows the class below).
public final class AlbumQuery
{
@NonNull
public static HashMap<String, AlbumQuery.Album> get(@NonNull final Context context)
{
final HashMap<String, AlbumQuery.Album> output = new HashMap<>();
final Uri contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
final String[] projection = {MediaStore.Images.Media.BUCKET_DISPLAY_NAME, MediaStore.Images.Media.BUCKET_ID};
try (final Cursor cursor = context.getContentResolver().query(contentUri, projection, null, null, null))
{
if ((cursor != null) && (cursor.moveToFirst() == true))
{
final int columnBucketName = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.BUCKET_DISPLAY_NAME);
final int columnBucketId = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.BUCKET_ID);
do
{
final String bucketId = cursor.getString(columnBucketId);
final String bucketName = cursor.getString(columnBucketName);
if (output.containsKey(bucketId) == false)
{
final int count = AlbumQuery.getCount(context, contentUri, bucketId);
final AlbumQuery.Album album = new AlbumQuery.Album(bucketId, bucketName, count);
output.put(bucketId, album);
}
} while (cursor.moveToNext());
}
}
return output;
}
private static int getCount(@NonNull final Context context, @NonNull final Uri contentUri, @NonNull final String bucketId)
{
try (final Cursor cursor = context.getContentResolver().query(contentUri,
null, MediaStore.Images.Media.BUCKET_ID + "=?", new String[]{bucketId}, null))
{
return ((cursor == null) || (cursor.moveToFirst() == false)) ? 0 : cursor.getCount();
}
}
public static final class Album
{
@NonNull
public final String bucketId;
@NonNull
public final String bucketName;
public final int count;
Album(@NonNull final String bucketId, @NonNull final String bucketName, final int count)
{
this.bucketId = bucketId;
this.bucketName = bucketName;
this.count = count;
}
}
}
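Not part of the original answer, but a hedged sketch of that per-bucket thumbnail lookup could look like the following (picking the newest image as the cover is an assumption):
private static Uri getThumbnailUri(@NonNull final Context context, @NonNull final String bucketId)
{
    final Uri contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
    // Only the image id of the newest picture in this bucket is needed
    final String[] projection = {MediaStore.Images.Media._ID};
    try (final Cursor cursor = context.getContentResolver().query(contentUri, projection,
            MediaStore.Images.Media.BUCKET_ID + "=?", new String[]{bucketId},
            MediaStore.Images.Media.DATE_ADDED + " DESC"))
    {
        if ((cursor != null) && cursor.moveToFirst())
        {
            final long imageId = cursor.getLong(cursor.getColumnIndexOrThrow(MediaStore.Images.Media._ID));
            // Content Uri that an image loader can use as the album cover
            return ContentUris.withAppendedId(contentUri, imageId);
        }
    }
    return null;
}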
This is a more efficient (though not perfect) way to do it.
I am doing it for videos, but it works the same way for images: just change MediaStore.Video.Media.X to MediaStore.Images.Media.X.
public class QUtils {
/*created by Nasib June 6, 2020*/
@RequiresApi(api = Build.VERSION_CODES.Q)
public static ArrayList<FolderHolder> loadListOfFolders(Context context) {
ArrayList<FolderHolder> allFolders = new ArrayList<>();//list that we need
HashMap<Long, String> folders = new HashMap<>(); //hashmap to track(no duplicates) folders by using their ids
String[] projection = {MediaStore.Video.Media._ID,
MediaStore.Video.Media.BUCKET_ID,
MediaStore.Video.Media.BUCKET_DISPLAY_NAME,
MediaStore.Video.Media.DATE_ADDED};
ContentResolver CR = context.getContentResolver();
Uri root = MediaStore.Video.Media.getContentUri(MediaStore.VOLUME_EXTERNAL);
Cursor c = CR.query(root, projection, null, null, MediaStore.Video.Media.DATE_ADDED + " desc");
if (c != null && c.moveToFirst()) {
int folderIdIndex = c.getColumnIndexOrThrow(MediaStore.Video.Media.BUCKET_ID);
int folderNameIndex = c.getColumnIndexOrThrow(MediaStore.Video.Media.BUCKET_DISPLAY_NAME);
int thumbIdIndex = c.getColumnIndexOrThrow(MediaStore.Video.Media._ID);
int dateAddedIndex = c.getColumnIndexOrThrow(MediaStore.Video.Media.DATE_ADDED);
do {
Long folderId = c.getLong(folderIdIndex);
if (folders.containsKey(folderId) == false) { //proceed only if the folder data has not been inserted already :)
long thumbId = c.getLong(thumbIdIndex);
String folderName = c.getString(folderNameIndex);
long dateAdded = c.getLong(dateAddedIndex); // stored as long to match the FolderHolder constructor
Uri thumbPath = ContentUris.withAppendedId(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, thumbId);
folders.put(folderId, folderName);
allFolders.add(new FolderHolder(folderId, String.valueOf(thumbPath), folderName, dateAdded));
}
} while (c.moveToNext());
c.close(); //close cursor
folders.clear(); //clear the hashmap because it's no longer needed
}
return allFolders;
}
}
FolderHolder model class
public class FolderHolder {
private String folderName;
public long dateAdded;
private String thumbnailPath;
public long folderId;
public void setPath(String thumbnailPath) {
this.thumbnailPath = thumbnailPath;
}
public String getthumbnailPath() {
return thumbnailPath;
}
public FolderHolder(long folderId, String thumbnailPath, String folderName, long dateAdded) {
this.folderId = folderId;
this.folderName = folderName;
this.thumbnailPath = thumbnailPath;
this.dateAdded = dateAdded;
}
public String getFolderName() {
return folderName;
}
}
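A hypothetical call site for the two classes above (API 29+, as required by the @RequiresApi annotation) would be:
ArrayList<FolderHolder> folders = QUtils.loadListOfFolders(context);
for (FolderHolder folder : folders) {
    // getFolderName() and getthumbnailPath() come from the model class above
    Log.d("FOLDERS", folder.getFolderName() + " -> " + folder.getthumbnailPath());
}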
GROUP BY can still be used when querying with a Bundle of query arguments:
val bundle = Bundle().apply {
putString(
ContentResolver.QUERY_ARG_SQL_SORT_ORDER,
"${MediaStore.MediaColumns.DATE_MODIFIED} DESC"
)
putString(
ContentResolver.QUERY_ARG_SQL_GROUP_BY,
MediaStore.Images.ImageColumns.BUCKET_ID
)
}
contentResolver.query(
uri,
arrayOf(
MediaStore.Images.ImageColumns.BUCKET_ID,
MediaStore.Images.ImageColumns.BUCKET_DISPLAY_NAME,
MediaStore.Images.ImageColumns.DATE_TAKEN,
MediaStore.Images.ImageColumns.DATA
),
bundle,
null
)

Database update without data loss. FATAL EXCEPTION: ModernAsyncTask #1

I need to implement an update of a database shipped in the assets. User data, namely whether a record is marked as "favorite" or not, should be preserved.
I already asked a question and was helped: https://stackoverflow.com/a/53827525/10261947
Everything worked in a test application. But when I transferred the code (exactly the same) to the real application, an error occurs - E/AndroidRuntime: FATAL EXCEPTION: ModernAsyncTask #1
Process: rodionova.lyubov.brodsky, PID: 4196
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.support.v4.content.ModernAsyncTask$3.done(ModernAsyncTask.java:161)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:383)
at java.util.concurrent.FutureTask.setException(FutureTask.java:252)
at java.util.concurrent.FutureTask.run(FutureTask.java:271)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1162)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:636)
at java.lang.Thread.run(Thread.java:784)
Caused by: java.lang.IllegalArgumentException: the bind value at index 4 is null
at android.database.sqlite.SQLiteProgram.bindString(SQLiteProgram.java:169)
at android.database.sqlite.SQLiteProgram.bindAllArgsAsStrings(SQLiteProgram.java:205)
at android.database.sqlite.SQLiteDirectCursorDriver.query(SQLiteDirectCursorDriver.java:47)
at android.database.sqlite.SQLiteDatabase.rawQueryWithFactory(SQLiteDatabase.java:1397)
at android.database.sqlite.SQLiteDatabase.queryWithFactory(SQLiteDatabase.java:1239)
at android.database.sqlite.SQLiteDatabase.query(SQLiteDatabase.java:1110)
at android.database.sqlite.SQLiteDatabase.query(SQLiteDatabase.java:1278)
at rodionova.lyubov.brodsky.db.PoemsDbHelper.insertCorePoem(PoemsDbHelper.java:121)
at rodionova.lyubov.brodsky.db.PoemsDbHelper.getNewPoems(PoemsDbHelper.java:90)
at rodionova.lyubov.brodsky.db.PoemsDbHelper.onUpgrade(PoemsDbHelper.java:41)
at com.readystatesoftware.sqliteasset.SQLiteAssetHelper.getWritableDatabase(SQLiteAssetHelper.java:197)
at com.readystatesoftware.sqliteasset.SQLiteAssetHelper.getReadableDatabase(SQLiteAssetHelper.java:254)
at rodionova.lyubov.brodsky.db.PoemsProvider.query(PoemsProvider.java:45)
at android.content.ContentProvider.query(ContentProvider.java:1057)
If I do not perform the update, the application works properly, so I will post only the DbHelper code.
public class PoemsDbHelper extends SQLiteAssetHelper {
public static final String DB_NAME = "brodsky.db";
public static final int DBVERSION = 3;
public static final String TBLNAME = "poems_table";
public static final String COL_ID = "id";
public static final String COL_TITLE = "title";
public static final String COl_POEM = "poem";
public static final String COL_SUBJECT = "subject";
public static final String COL_YEARS = "years";
public static final String COL_FAVOURITE = "favorite";
Context mContext;
public PoemsDbHelper(Context context) {
super(context, DB_NAME, null, DBVERSION);
mContext = context;
}
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
if(newVersion > oldVersion)
getNewPoems(mContext, db);
}
private void getNewPoems(Context context, SQLiteDatabase db) {
InputStream is;
OutputStream os;
final String tempNewDbName = "temp_brodsky.db";
int buffersize = 4096;
byte[] buffer = new byte[buffersize];
String newDBPath = mContext.getDatabasePath(tempNewDbName).getPath();
File newDBFile = new File(newDBPath);
if (newDBFile.exists()) {
newDBFile.delete();
}
File newDBFileDirectory = newDBFile.getParentFile();
if (!newDBFileDirectory.exists()) {
newDBFileDirectory.mkdirs();
}
try {
is = context.getAssets().open("databases/" + DB_NAME);
os = new FileOutputStream(newDBFile);
int bytes_read;
while ((bytes_read = is.read(buffer,0,buffersize)) > 0) {
os.write(buffer, 0, bytes_read); // write only the bytes actually read
}
os.flush();
os.close();
is.close();
}catch (IOException e) {
e.printStackTrace();
throw new RuntimeException("Ouch updated database not copied - processing stopped - see stack-trace above.");
}
long id = maxid(db) + 1;
SQLiteDatabase newdb = SQLiteDatabase.openDatabase(newDBFile.getPath(),null,SQLiteDatabase.OPEN_READONLY);
Cursor csr = newdb.query(TBLNAME,null,null,null,null,null,null);
long insert_result;
db.beginTransaction();
while (csr.moveToNext()) {
insert_result = insertCorePoem(
db,
id,
csr.getString(csr.getColumnIndex(COL_TITLE)),
csr.getString(csr.getColumnIndex(COl_POEM)),
csr.getString(csr.getColumnIndex(COL_SUBJECT)),
csr.getString(csr.getColumnIndex(COL_YEARS)),
csr.getString(csr.getColumnIndex(COL_FAVOURITE))
);
if (insert_result > 0) {
id++;
}
}
db.setTransactionSuccessful();
db.endTransaction();
csr.close();
newDBFile.delete();
}
public long insertCorePoem(SQLiteDatabase db, long id, String title, String poem, String subject, String years, String favourite) {
String whereclause = COL_TITLE + "=? AND " + COl_POEM + "=? AND " + COL_SUBJECT + "=? AND " + COL_YEARS + "=?";
String[] whereargs = new String[]{
title,
poem,
subject,
years
};
Cursor csr = db.query(TBLNAME,null,whereclause,whereargs,null,null,null);
boolean rowexists = (csr.getCount() > 0);
csr.close();
if (rowexists) {
Log.d("INSERTCOREPOEM","Skipping insert of row");
return -2;
}
ContentValues cv = new ContentValues();
cv.put(COL_ID,id);
cv.put(COL_TITLE,title);
cv.put(COl_POEM,poem);
cv.put(COL_SUBJECT,subject);
cv.put(COL_YEARS,years);
cv.put(COL_FAVOURITE,favourite);
Log.d("INSERTCOREPOEM","Inserting new column with id " + String.valueOf(id));
return db.insert(TBLNAME, null, cv);
}
private long maxid(SQLiteDatabase db) {
long rv = 0;
String extractcolumn = "maxid";
String[] col = new String[]{"max(" + COL_ID + ") AS " + extractcolumn};
Cursor csr = db.query(TBLNAME,col,null,null,null,null,null);
if (csr.moveToFirst()) {
rv = csr.getLong(csr.getColumnIndex(extractcolumn));
}
csr.close();
return rv;
}
}
I do not understand what is wrong. Identical code works fine in the test application. I would be grateful for help.
Your issue is that you likely have a null value in the years column of one or more rows in the updated database that the data is being copied from.
Although you could change the code to handle this (skip the insertion or provide a default year value), the end result may not be what you want. So the most likely fix would be to amend the source database to have valid/useful year values.
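If you would rather guard in code, a hedged sketch of such a check at the top of insertCorePoem (not part of the original answer) could be:
// Skip rows whose years value is null so the WHERE clause never binds a null argument
if (years == null) {
    Log.d("INSERTCOREPOEM", "Skipping row with null years value");
    return -3; // or substitute a default, e.g. years = "unknown";, and continue
}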

Update Query with annotation using Spring and MongoRepository

I am using the latest version of Spring Boot and Spring Data MongoRepository. I have written a custom repository interface
public interface CompanyRepository extends MongoRepository<Company, String>{
@Query(value = "{ 'employer.userId' : ?0 }")
Company findByCompanyUserUserId(String userId);
}
In the same way I want to use the @Query annotation to update a particular field. Can someone suggest how?
Create an annotation like this:
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD})
public @interface MongoUpdate {
String find() default "{}";
String update() default "{}";
String collection();
boolean multi() default false;
}
And an aspect like this:
@Aspect
@Component
@SuppressWarnings("unchecked")
public class MongoUpdateAspect {
private static final Logger logger = LoggerFactory.getLogger(MongoUpdateAspect.class);
@Autowired
private MongoTemplate mongoTemplate;
@Pointcut("@annotation(com.ofb.commons.aop.common.MongoUpdate)")
public void pointCut() {
}
@Around("com.ofb.commons.aspect.MongoUpdateAspect.pointCut() && @annotation(mongoUpdate)")
public Object applyQueryUpdate(ProceedingJoinPoint joinPoint, MongoUpdate mongoUpdate) throws Throwable {
Object[] args = joinPoint.getArgs();
String findQuery = mongoUpdate.find();
String updateQuery = mongoUpdate.update();
String collection = mongoUpdate.collection();
boolean multiUpdate = mongoUpdate.multi();
for (int i = 0; i < args.length; i++) {
if (args[i] instanceof Collection) {
Collection collection1 = (Collection) args[i];
String replaceStr = (String) collection1.stream().map(object -> {
if (object instanceof Number) {
return object.toString();
} else {
return String.format("\"%s\"", object.toString());
}
}).collect(Collectors.joining(","));
findQuery = findQuery.replace(String.format("?%s", i), replaceStr);
updateQuery = updateQuery.replace(String.format("?%s", i), replaceStr);
} else if (args[i] instanceof Object[]) {
Object[] objects = (Object[]) args[i];
String replaceStr = Arrays.stream(objects).map(object -> {
if (object instanceof Number) {
return object.toString();
} else {
return String.format("\"%s\"", object.toString());
}
}).collect(Collectors.joining(","));
findQuery = findQuery.replace(String.format("?%s", i), replaceStr);
updateQuery = updateQuery.replace(String.format("?%s", i), replaceStr);
} else {
if (args[i] instanceof Number) {
findQuery = findQuery.replace(String.format("?%s", i), args[i].toString());
updateQuery = updateQuery.replace(String.format("?%s", i), args[i].toString());
} else {
findQuery = findQuery.replace(String.format("?%s", i), String.format("\"%s\"", args[i].toString()));
updateQuery =
updateQuery.replace(String.format("?%s", i), String.format("\"%s\"", args[i].toString()));
}
}
}
Query query = new BasicQuery(findQuery);
Update update = new BasicUpdate(updateQuery);
if (multiUpdate) {
mongoTemplate.updateMulti(query, update, collection);
} else {
mongoTemplate.updateFirst(query, update, collection);
}
return null;
}
}
This will not work on MongoRepository interfaces themselves, but you can create an empty-bodied method in your service layer:
@MongoUpdate(find = "{}", update = "{$push : {'offFeatures' : ?0}}", collection = "userPreference", multi = true)
public void offFeatures(String feature) {
}
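Calling it from application code is then an ordinary method call and the aspect performs the update; userPreferenceService and the feature name below are only hypothetical examples:
userPreferenceService.offFeatures("nightMode"); // intercepted by MongoUpdateAspect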
It's a reasonable question. Assuming that you're using the org.springframework.data.mongodb.repository.MongoRepository interface, can you not simply use the insert(..) or save(..) methods for what you need?
API docs
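For illustration, a minimal sketch of that save(..) route with the repository from the question (the setter is hypothetical, since the Company fields are not shown):
Company company = companyRepository.findByCompanyUserUserId("someUserId");
company.setCompanyName("New name"); // hypothetical setter on Company
companyRepository.save(company); // writes the modified document back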

org.hibernate.search.bridge.BridgeException: Exception while calling bridge#objectToString

I am able to insert records and index them, but I am facing an exception while searching:
org.hibernate.search.bridge.BridgeException: Exception while calling bridge#objectToString
class: com.edoors.formBean.Hib_cons_Cv
path: cons_cv
I am able to search on all columns of the table except the blob column.
Field Bridge
public class ByteArrayBridge implements TwoWayStringBridge {
public String objectToString(Object object) {
byte[] data = (byte[]) object;
StringWriter writer = new StringWriter();
InputStream is = null;
try {
is = new ByteArrayInputStream(data);
new AutoDetectParser().parse(is,new WriteOutContentHandler(writer),new Metadata(),new
ParseContext());
return writer.toString(); // return the text extracted by the parser
} catch (Exception e) {
System.out.println("Exception "+e);
}
return writer.toString();
}
public Object stringToObject(String string) {
byte[] data=string.getBytes();
Object obj=data;
return obj;
}
}
DAO Class:
public List searchConsultantByTitle(String jobtitle)
{
List list=null;
Session session = hiberUtil.openSession();
Transaction tx = null;
try{
tx = session.beginTransaction();
FullTextSession fullTextSession = Search.getFullTextSession(session);
QueryBuilder queryBuilder =
fullTextSession.getSearchFactory().buildQueryBuilder().forEntity(Hib_cons_Cv.class).get();
org.apache.lucene.search.Query luceneQuery = null;
luceneQuery =
queryBuilder.keyword().fuzzy().withThreshold(0.7f).onField("cons_cv").matching(jobtitle).createQuery();
FullTextQuery hibernateQuery = fullTextSession.createFullTextQuery(luceneQuery, Hib_cons_Cv.class);
int resultSize = hibernateQuery.getResultSize();
System.out.println(".....resultSize..............................."+resultSize);
}
catch(Exception e)
{
System.out.println(e.getMessage());
}
return list;
}
POJO Class
@Entity
@AnalyzerDef(name = "customanalyzer", tokenizer = @TokenizerDef(factory =
KeywordTokenizerFactory.class), filters = {
@TokenFilterDef(factory = LowerCaseFilterFactory.class),
@TokenFilterDef(factory = SnowballPorterFilterFactory.class, params = {
@Parameter(name = "language", value = "English") }) })
@Indexed
public class Hib_cons_Cv {
@Column(name = "cons_cv", unique = false, nullable = false, length = 59296)
@Lob
@Field(analyze = Analyze.NO, store = Store.YES)
@FieldBridge(impl = ByteArrayBridge.class)
private Blob cons_cv;
//setters and getters
}
I also got this error (with no stack trace). It turned out I had put in the wrong field name; the query was actually using a field with no bridging.
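In other words, double-check that the name passed to onField(...) is the property that actually carries the bridge; with the mapping from the question that is:
// "cons_cv" is the field annotated with @FieldBridge(impl = ByteArrayBridge.class);
// per the answer above, the error appeared when the query targeted a field without that bridge.
luceneQuery = queryBuilder.keyword().onField("cons_cv").matching(jobtitle).createQuery();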

How to retrieve an embedded list of object of Entity?

I have a simple problem storing and retrieving an embedded collection of entities with MongoDB. I have checked these questions:
how to serialize class? and Mongodb saves list of object
What I understand is that to save a list of objects, the class of those objects must extend ReflectionDBObject. This worked for saving the object, but retrieving it with the embedded collection does not work.
Here is a simple test showing that retrieving embedded entities does not work:
@Test
public void whatWasStoredAsEmbeddedCollectionIsRetrieved2() {
BasicDBObject country = new BasicDBObject();
country.put("name", "Bulgaria");
List<City> cities = Lists.newArrayList(new City("Tarnovo"));
country.put("listOfCities", cities);
DBCollection collection = db().get().getCollection("test_Collection");
collection.save(country);
DBCursor object = collection.find(new BasicDBObject().append("name", "Bulgaria"));
DBObject returnedCity = object.next();
DBObject embeddedCities = (DBObject) returnedCity.get("listOfCities");
System.out.println(embeddedCities);
}
Here is the City Class
class City extends ReflectionDBObject {
String name;
City() {
}
City(String name) {
this.name = name;
}
public String getName() {
return name;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (!(o instanceof City)) return false;
City city = (City) o;
if (name != null ? !name.equals(city.name) : city.name != null) return false;
return true;
}
@Override
public int hashCode() {
return name != null ? name.hashCode() : 0;
}
@Override
public String toString() {
return "City{" +
"name='" + name + '\'' +
'}';
}
}
The output of the System.out.println statement is [ { "_id" : null }]
Now how can I get back the embedded object and the embedded list in it?
If you do not have a requirement to define your own City class, you can define the subdocuments using BasicDBObject. I only added the 'name' field to citySubDoc1 and citySubDoc2, but of course you can add more fields to these subdocuments.
// Build the parent document and collection as in the question
DBCollection collection = db().get().getCollection("test_Collection");
BasicDBObject country = new BasicDBObject();
country.put("name", "Bulgaria");
// Define subdocuments
BasicDBObject citySubDoc1 = new BasicDBObject();
citySubDoc1.put("name", "Tarnovo");
BasicDBObject citySubDoc2 = new BasicDBObject();
citySubDoc2.put("name", "Sofia");
// add to list
List<DBObject> cities = new ArrayList <DBObject>();
cities.add(citySubDoc1);
cities.add(citySubDoc2);
country.put("listOfCities", cities);
collection.save(country);
// Specify query condition
BasicDBObject criteriaQuery = new BasicDBObject();
criteriaQuery.put("name", "Bulgaria");
// Perform the read
DBCursor cursor = collection.find(criteriaQuery);
// Loop through the results
try {
while (cursor.hasNext()) {
List myReturnedListOfCities = (List) cursor.next().get("listOfCities");
System.out.println(myReturnedListOfCities);
}
} finally {
cursor.close();
}
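If you still want City instances back, a hedged follow-up to the loop above (assuming the City(String) constructor from the question is accessible) is to map each subdocument manually:
// Convert the raw subdocuments back into City objects
List<City> returnedCities = new ArrayList<City>();
for (Object o : myReturnedListOfCities) {
    DBObject cityDoc = (DBObject) o;
    returnedCities.add(new City((String) cityDoc.get("name")));
}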