Split values of CouchDB result - GWT

I want to split the values selected from CouchDB in GWT according to a particular id. I tried to use StringTokenizer but couldn't find anything useful. The result returned from CouchDB is in the following format:
{"_id":"2","_rev":"1-717f76046030a683687ace9ac8f7bdbf","course":"jbdgjbj","passwd":"rty","phone":"24125514444","clgnme":"bjfbjf","address":"jbjfbjb","name":"meenal","cpasswd":"rty","user":"2","fname":"jfbg"}
I just want to get the values and set them in textboxes. How do I do that?

Try this:
Session studentDbSession = new Session("localhost", 5984);
Database studentCouchDb = studentDbSession.getDatabase("student");
Document d = studentCouchDb.getDocument(input); // input is the document id, e.g. "2"
if (d.containsKey("FirstName")) {
    fname = d.getString("FirstName"); // fname is a String variable
}
if (d.containsKey("LastName")) {
    lname = d.getString("LastName"); // lname is a String variable
}
if (d.containsKey("Address")) {
    add = d.getString("Address"); // add is a String variable
}
Finally, concatenate all the strings and return the result to the client:
all = fname + " " + lname + " " + add;
return all;
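On the GWT client you would then split the returned string back into its parts and drop them into your textboxes. A rough sketch (the service and widget names are made up for illustration, and it assumes none of the values contain spaces):
studentService.getStudent("2", new AsyncCallback<String>() {
    public void onFailure(Throwable caught) {
        Window.alert("Lookup failed: " + caught.getMessage());
    }
    public void onSuccess(String result) {
        // split the concatenated "fname lname address" string back apart
        String[] parts = result.split(" ");
        fnameBox.setText(parts[0]);
        lnameBox.setText(parts[1]);
        addressBox.setText(parts[2]);
    }
});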

This looks like JSON. You can parse it using JSONObject:
JSONObject obj = new JSONObject(yourString);
String course = obj.getString("course");
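A slightly fuller sketch against the keys in your sample document (the textbox names are made up; also note that org.json's JSONObject is server-side code - in client-side GWT you'd parse with com.google.gwt.json.client.JSONParser instead):
JSONObject obj = new JSONObject(couchResult); // couchResult is the string shown above
nameBox.setText(obj.getString("name"));
courseBox.setText(obj.getString("course"));
phoneBox.setText(obj.getString("phone"));
addressBox.setText(obj.getString("address"));
clgBox.setText(obj.getString("clgnme"));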

Related

Salesforce trigger - not able to understand

Below is code written by a colleague who doesn't work at the firm anymore. I am inserting records into an object with Data Loader and I can see the success message, but I do not see any records in my object. I am not able to understand what the trigger below is doing. Please can someone help me understand it, as I am new to Salesforce.
trigger DataLoggingTrigger on QMBDataLogging__c (after insert) {
    Map<String, Schema.RecordTypeInfo> recordTypeInfo = Schema.SObjectType.QMB_Initial_Letter__c.getRecordTypeInfosByName();
    List<QMBDataLogging__c> logList = (List<QMBDataLogging__c>) Trigger.new;
    List<SObject> sobjList = (List<SObject>) Type.forName('List<' + 'QMB_Initial_Letter__c' + '>').newInstance();
    Map<String, QMBLetteTypeToVfPage__c> QMBLetteTypeToVfPage = QMBLetteTypeToVfPage__c.getAll();
    Map<String, QMBLetteTypeToVfPage__c> mapofLetterTypeRec = new Map<String, QMBLetteTypeToVfPage__c>();
    Set<Id> processdIds = new Set<Id>();
    for (String key : QMBLetteTypeToVfPage.keySet()) {
        if (!mapofLetterTypeRec.containsKey(key)) mapofLetterTypeRec.put(QMBLetteTypeToVfPage.get(key).Letter_Type__c, QMBLetteTypeToVfPage.get(key));
    }
    for (QMBDataLogging__c log : logList) {
        SObject logRecord = (SObject) log;
        SObject QMBLetterRecord = new QMB_Initial_Letter__c();
        if (mapofLetterTypeRec.containsKey(log.Field1__c)) {
            String recordTypeId = recordTypeInfo.get(mapofLetterTypeRec.get(log.Field1__c).RecordType__c).isAvailable() ? recordTypeInfo.get(mapofLetterTypeRec.get(log.Field1__c).RecordType__c).getRecordTypeId() : recordTypeInfo.get('Master').getRecordTypeId();
            String fieldApiNames = mapofLetterTypeRec.containsKey(log.Field1__c) ? mapofLetterTypeRec.get(log.Field1__c).FieldAPINames__c : '';
            //QMBLetterRecord.put('Letter_Type__c',log.Name);
            QMBLetterRecord.put('RecordTypeId', recordTypeId);
            processdIds.add(log.Id);
            if (String.isNotBlank(fieldApiNames) && fieldApiNames.contains(',')) {
                Integer i = 1;
                for (String fieldApiName : fieldApiNames.split(',')) {
                    String logFieldApiName = 'Field' + i + '__c';
                    fieldApiName = fieldApiName.trim();
                    System.debug('fieldApiName==' + fieldApiName);
                    Schema.DisplayType fielddataType = getFieldType('QMB_Initial_Letter__c', fieldApiName);
                    if (fielddataType == Schema.DisplayType.Date) {
                        Date dateValue = Date.parse(String.valueOf(logRecord.get(logFieldApiName)));
                        QMBLetterRecord.put(fieldApiName, dateValue);
                    } else if (fielddataType == Schema.DisplayType.DOUBLE) {
                        String value = (String) logRecord.get(logFieldApiName);
                        Double dec = Double.valueOf(value.replace(',', ''));
                        QMBLetterRecord.put(fieldApiName, dec);
                    } else if (fielddataType == Schema.DisplayType.CURRENCY) {
                        Decimal decimalValue = Decimal.valueOf((String) logRecord.get(logFieldApiName));
                        QMBLetterRecord.put(fieldApiName, decimalValue);
                    } else if (fielddataType == Schema.DisplayType.INTEGER) {
                        String value = (String) logRecord.get(logFieldApiName);
                        Integer integerValue = Integer.valueOf(value.replace(',', ''));
                        QMBLetterRecord.put(fieldApiName, integerValue);
                    } else if (fielddataType == Schema.DisplayType.DATETIME) {
                        DateTime dateTimeValue = DateTime.valueOf(logRecord.get(logFieldApiName));
                        QMBLetterRecord.put(fieldApiName, dateTimeValue);
                    } else {
                        QMBLetterRecord.put(fieldApiName, logRecord.get(logFieldApiName));
                    }
                    i++;
                }
            }
        }
        sobjList.add(QMBLetterRecord);
    }
    if (!sobjList.isEmpty()) {
        insert sobjList;
        if (!processdIds.isEmpty()) DeleteDoAsLoggingRecords.deleteTheProcessRecords(processdIds);
    }
    public static Schema.DisplayType getFieldType(String objectName, String fieldName) {
        SObjectType r = ((SObject) (Type.forName('Schema.' + objectName).newInstance())).getSObjectType();
        DescribeSObjectResult d = r.getDescribe();
        return d.fields.getMap().get(fieldName).getDescribe().getType();
    }
}
You might be looking in the wrong place. Check if there's a unit test written for this thing (there should be one, especially if it's deployed to production); it should help you understand how it's supposed to be used.
You're inserting records of QMBDataLogging__c, but then it seems they're immediately deleted in DeleteDoAsLoggingRecords.deleteTheProcessRecords(processdIds), whether or not whatever this thing was supposed to do succeeded.
This seems to be some poor man's CSV parser or generic "upload anything" tool that takes data stored in QMBDataLogging__c and creates QMB_Initial_Letter__c records out of it.
QMBLetteTypeToVfPage__c.getAll() suggests you could go to Setup -> Custom Settings, try to find this thing, and examine it. Maybe it has some values in production, but in your sandbox it's empty, and that's why essentially nothing works? Or maybe some of the values that are there are outdated?
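A quick way to check is to dump the custom setting's current contents from anonymous Apex in the Developer Console (the field names below are taken from the trigger above):
// Dump every row of the custom setting to the debug log
for (QMBLetteTypeToVfPage__c row : QMBLetteTypeToVfPage__c.getAll().values()) {
    System.debug(row.Name + ' => ' + row.Letter_Type__c + ' / ' + row.RecordType__c + ' / ' + row.FieldAPINames__c);
}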
There's a comparison of whether what you upload into Field1__c can be matched to what's in that custom setting. I guess you load some kind of subtype of your QMB_Initial_Letter__c in there. The record type name and the list of fields to read from your log record are also fetched from the custom setting based on that match.
Then this thing takes what you pasted, looks at the list of fields from the custom setting, and parses it.
Let's say the custom setting contains something like:
Name = XYZ, FieldAPINames__c = 'Name,SomePicklist__c,SomeDate__c,IsActive__c'
This thing will look at the first record you inserted; let's say you have a CSV like this:
Field1__c,Field2__c,Field3__c,Field4__c
XYZ,Closed,2022-09-15,true
It will try to parse and map it so that eventually you create a record that "normal" Apex code would express as:
new QMB_Initial_Letter__c(
    Name = 'XYZ',
    SomePicklist__c = 'Closed',
    SomeDate__c = Date.parse('2022-09-15'),
    IsActive__c = true
);
It's pretty fragile, as you probably already know. And because parsing CSV is an art, I expect it to absolutely crash and burn when text with commas in it shows up (some text,"text, with commas in it, should be quoted",more text).
In theory an admin can change the mapping in Setup - but then they'd need to add the new field to the loaded file anyway. Overcomplicated. I guess somebody did it to solve an issue with record type Ids - but there are better ways to achieve that and still have a normal CSV file with normal columns and strong type matching, not just chucking everything in as strings.
In theory this lets you have "jagged" CSV files (row 1 having 5 fields, row 2 having a different record type and 17 fields? No problem.)
Your call whether it's salvageable or whether you'd rather ditch it and try normal loading of QMB_Initial_Letter__c records (get back to your business people and ask for requirements?). If you do have a variable number of columns at the source, you'd need to standardise it or group the data so only one "type" of record (well, whatever's in that Field1__c) goes into each file.

Creating a LocalTime BsonDateTime object

I have a large logging and data-entry program that receives a DataTable object with one DataRow. The DataTable has a random number of columns, with random column names and types, so I cannot have a class for each; using the DataColumns I get each DataType and build a BsonDocument from scratch.
Here's a short example:
public void ParseData(DataTable table)
{
    // create the document
    var document = new BsonDocument();
    // get the only row in the table
    var row = table.Rows[0];
    // for each column we add the property
    foreach (DataColumn column in table.Columns)
    {
        // create an empty value
        BsonValue value = null;
        // current column value
        var columnValue = row[column.ColumnName];
        // set the value based on the datatype
        if (column.DataType == typeof(string)) value = new BsonString(columnValue.ToString());
        else if (column.DataType == typeof(int)) value = new BsonInt32(Convert.ToInt32(columnValue));
        else if (column.DataType == typeof(float)) value = new BsonDouble(Convert.ToDouble(columnValue));
        else if (column.DataType == typeof(double)) value = new BsonDouble(Convert.ToDouble(columnValue));
        else if (column.DataType == typeof(bool)) value = new BsonBoolean(Convert.ToBoolean(columnValue));
        else if (column.DataType == typeof(DateTime)) value = new BsonDateTime(Convert.ToDateTime(columnValue));
        // add the element
        document.Add(new BsonElement(column.ColumnName, value));
    }
    // insert the document in the generic collection
    InsertDocument(document);
}
As you can see, it's pretty simple. I have removed a lot of types from the list, as we have many custom types that might pass through, so I just kept the basic ones. The problem is that I cannot figure out how to force the BsonDateTime to be saved as local time in the collection, and filters from legacy apps don't work. I need the dates saved as local time. It's never been an issue in the past, but because of those legacy apps from the early 90s that still need support, I have to figure something out.
I also need to reload them as local time. If I could, I would save them as strings, but I can't: since all columns are random, I do not know when loading whether a specific BsonString is really a string or a DateTime. For reloading, I must not really reload as local time; I must reload the exact value in the database. I only control the creation of the document. On the reading side I control only a few of the readers, which are in C#, Java and C++; the rest are legacy apps from companies that don't even exist anymore.
I did try modifying every single date that came into the system to account for UTC, so that once saved as UTC it's stored properly and filters from legacy apps still work, but then all of the .NET, Java and C++ apps load the wrong value rather than the written value.
Is there a way to just disable UTC in a specific collection or database in MongoDB directly, like you can in SQL Server?
MongoDB stores times in UTC and does not have time zone support. You can store any values you like but they will be interpreted as UTC timestamps by most MongoDB-related software.
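If you accept that caveat, one common workaround is to mislabel the local wall-clock time as UTC, so the driver stores it unchanged and the stored instant matches local time. A minimal sketch (columnValue as in the code above; every reader has to apply the same convention in reverse):
// The C# driver converts DateTimeKind.Local values to UTC before storing.
// Marking the local reading as DateTimeKind.Utc skips that conversion,
// so the instant stored in the collection equals the local wall-clock time.
DateTime local = Convert.ToDateTime(columnValue);
DateTime fakeUtc = DateTime.SpecifyKind(local, DateTimeKind.Utc);
BsonValue value = new BsonDateTime(fakeUtc);

// When reading back, reinterpret the stored instant as local time:
DateTime wallClock = DateTime.SpecifyKind(value.ToUniversalTime(), DateTimeKind.Local);
Note that this deliberately stores a technically wrong instant; it only works if all of your non-legacy readers reinterpret it the same way.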

sqlite3 result set in Swift returning extra data

I am trying to fetch a number of records from a sqlite3 database and load them into an array. The code I have written, which seems to function correctly, at least as far as retrieving the correct number of records with the right values from the db, is:
while results?.next() == true {
    println("Got a result")
    var sname = results?.stringForColumn("surname")
    var fname = results?.stringForColumn("firstname")
    println("Retrieved \(sname) ,\(fname)")
}
The problem I have is that when I try to access the variables in the println statement, what it yields is:
Retrieved Optional("Smiles") ,Optional("Dick")
I have seemingly tried everything to get just the values but keep getting the Optional(...) wrapper. Any ideas?
Try this approach:
println("Retrieved \(sname!) ,\(fname!)")

How to Papa.unparse a huge JSON object

I'm using Papa.unparse() to convert a JSON object to CSV and then downloading the file. The method fails with:
"allocation size overflow papaparse.min.js:6:1580"
This happens in Firefox when there are more than 500,000 items to unparse in the JSON array.
The Papa.parse() method allows you to stream data from a file. Is there a similar approach you can take with Papa.unparse()?
You don't need PapaParse to do this.
The allocation size overflow comes from converting your 500,000 rows into 500,000 strings and concatenating them to form one massive string to create the CSV file with. JavaScript creates a new string for every concatenation, so eventually you run out of memory and crash.
The solution is to use TextEncoder to encode each row's string into a UTF-8 (or whatever you need) buffer, push each one into a big array, and then use that array to create your file.
Here's some rough code for how you might do that:
var textEncoder = new TextEncoder("utf-8");
var headers = ["header1", "header2", "header3"];
var row1 = ["column1-1", "column1-2", "column1-3"];
var row2 = ["column2-1", "column2-2", "column2-3"];
var data = [headers, row1, row2];
var arrayBuffers = [];
var csvString = '';
var encodedString = null;
var file = null;
for (var x = 0; x < data.length; x++) {
    // encode one CSV row at a time instead of building one giant string
    csvString = data[x].join(",").concat('\r\n');
    encodedString = textEncoder.encode(csvString);
    arrayBuffers.push(encodedString);
}
file = new File(arrayBuffers, "yourCsvFile.csv", { type: "text/csv" });
I use text-encoding as a TextEncoder polyfill, and presumably if you're trying to Papa.unparse you've already got the File API.
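To actually hand the file to the user as a download, you can point a temporary anchor at it (a sketch using only standard browser APIs; the filename is just an example):
var url = URL.createObjectURL(file);
var a = document.createElement('a');
a.href = url;
a.download = 'yourCsvFile.csv';
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
URL.revokeObjectURL(url); // release the object URL once the download starts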

How to parse SMValue to String array

I am just getting familiar with the StackMob server-side custom code SDK, and I am trying to get a relationship field and iterate through it in the form of a String array. How do I do that? Is it possible to iterate through it without parsing it into an array?
DataService ds = serviceProvider.getDataService();
List<SMCondition> query = new ArrayList<SMCondition>();
query.add(new SMEquals("product_id", new SMString(jsonObj.getString("product_id"))));
List<SMObject> results = ds.readObjects("product", query);
SMObject product = results.get(0);
// product.getValue().get("categories"); how do I get this to be a String array?
At its simplest, that would look something like this:
List<SMValue> categories = (List<SMValue>) (product.getValue().get("categories").getValue());
for (SMValue smString : categories) {
    SMString stringValue = (SMString) smString;
    // do whatever you want with stringValue.getValue() here
}
Obviously there are some unchecked casts in here, so you will want to add type and null checks in the appropriate sections, depending on your data and schema.
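If you specifically need a plain String array, you can collect the values as you iterate - a sketch assuming the same SMValue accessors as above (check your SDK version):
// Accumulate the raw strings, then convert to a String[]
List<String> categoryList = new ArrayList<String>();
for (SMValue smString : categories) {
    categoryList.add(((SMString) smString).getValue());
}
String[] categoryArray = categoryList.toArray(new String[categoryList.size()]);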