Exporting Records from Acumatica via the Screen-Based API (SOAP)

This topic demonstrates how to export records from Acumatica ERP via the Screen-Based API. The Screen-Based API of Acumatica ERP provides only a SOAP interface; if your development platform has limited support for SOAP web services, consider the Contract-Based API, which provides both SOAP and REST interfaces. For more information on the Screen-Based API, see the Acumatica ERP documentation.

Data Export from an Entry Form with a Single Primary Key
The Stock Items screen (IN.20.25.00) is one of the most frequently used data entry forms of Acumatica ERP for data export. Inventory ID is the only primary key field on the Stock Items screen.
To export records from a data entry form, your SOAP request must always begin with the ServiceCommands.Every[Key] command, where [Key] is replaced with the name of the primary key.
To export all stock items in a single web service call:
Screen context = new Screen();
context.CookieContainer = new System.Net.CookieContainer();
context.Url = "http://localhost/AcumaticaERP/Soap/IN202500.asmx";
context.Login(username, password);
try
{
Content stockItemsSchema = PX.Soap.Helper.GetSchema<Content>(context);
Field lastModifiedField = new Field
{
ObjectName = stockItemsSchema.StockItemSummary.InventoryID.ObjectName,
FieldName = "LastModifiedDateTime"
};
var commands = new Command[]
{
stockItemsSchema.StockItemSummary.ServiceCommands.EveryInventoryID,
stockItemsSchema.StockItemSummary.InventoryID,
stockItemsSchema.StockItemSummary.Description,
stockItemsSchema.GeneralSettingsItemDefaults.ItemClass,
stockItemsSchema.GeneralSettingsUnitOfMeasureBaseUnit.BaseUnit,
lastModifiedField
};
var items = context.Export(commands, null, 0, false, false);
}
finally
{
context.Logout();
}
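The Export method returns the exported records as a string[][]: one inner array per record, with the columns in the same order as the field commands in the request (the batching sample below relies on this when it reads column 0 as InventoryID). A minimal illustration of reading the result, not part of the original sample:
// Illustration only: column order mirrors the field commands above:
// 0 = InventoryID, 1 = Description, 2 = ItemClass, 3 = BaseUnit, 4 = LastModifiedDateTime.
foreach (string[] item in items)
{
Console.WriteLine("{0} - {1} (last modified {2})", item[0], item[1], item[4]);
}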
Over time, the amount of data in any ERP application tends to grow. If you export all records from your Acumatica ERP instance in a single web service call, you may soon start seeing timeout errors. Increasing the timeout is a possible one-time fix, but not a good long-term solution. A better approach is to export stock items in batches of several records.
To export stock items in batches of 10 records:
Screen context = new Screen();
context.CookieContainer = new System.Net.CookieContainer();
context.Url = "http://localhost/AcumaticaERP/Soap/IN202500.asmx";
context.Login(username, password);
try
{
Content stockItemsSchema = PX.Soap.Helper.GetSchema<Content>(context);
Field lastModifiedField = new Field
{
ObjectName = stockItemsSchema.StockItemSummary.InventoryID.ObjectName,
FieldName = "LastModifiedDateTime"
};
var commands = new Command[]
{
stockItemsSchema.StockItemSummary.ServiceCommands.EveryInventoryID,
stockItemsSchema.StockItemSummary.InventoryID,
stockItemsSchema.StockItemSummary.Description,
stockItemsSchema.GeneralSettingsItemDefaults.ItemClass,
stockItemsSchema.GeneralSettingsUnitOfMeasureBaseUnit.BaseUnit,
lastModifiedField
};
var items = context.Export(commands, null, 10, false, false);
while (items.Length == 10)
{
var filters = new Filter[]
{
new Filter
{
Field = stockItemsSchema.StockItemSummary.InventoryID,
Condition = FilterCondition.Greater,
Value = items[items.Length - 1][0]
}
};
items = context.Export(commands, filters, 10, false, false);
}
}
finally
{
context.Logout();
}
There are two main differences between the single-call approach and the export in batches:
in the single-call approach, the topCount parameter of the Export command is always set to 0
when exporting records in batches, the batch size is configured through the topCount parameter, supplemented by a Filter array that requests the next result set; the filter value is the key value of the last record in the previous batch (column 0 of the result corresponds to InventoryID, the first field command in the array)
Data Export from an Entry Form with a Composite Primary Key
The Sales Orders screen (SO.30.10.00) is a perfect example of a data entry form with a composite primary key. The primary key on the Sales Orders screen is composed of the Order Type and the Order Number.
The recommended two-step strategy to export data from the Sales Orders screen, or from any other data entry form with a composite primary key, via the Screen-Based API is:
in step 1, request all types of orders previously created in your Acumatica ERP application
in step 2, export orders of each type independently, either in a single call or in batches
To request all types of existing orders:
Screen context = new Screen();
context.CookieContainer = new System.Net.CookieContainer();
context.Url = "http://localhost/AcumaticaERP/Soap/SO301000.asmx";
context.Login(username, password);
try
{
Content orderSchema = PX.Soap.Helper.GetSchema<Content>(context);
var commands = new Command[]
{
orderSchema.OrderSummary.ServiceCommands.EveryOrderType,
orderSchema.OrderSummary.OrderType,
};
var types = context.Export(commands, null, 1, false, false);
}
finally
{
context.Logout();
}
In the SOAP call above, notice that the topCount parameter of the Export command is set to 1. The purpose of this request is only to get the types of orders previously created in your Acumatica ERP application, not to export data.
To export records of each type independently in batches:
Screen context = new Screen();
context.CookieContainer = new System.Net.CookieContainer();
context.Url = "http://localhost/AcumaticaERP/Soap/SO301000.asmx";
context.Login(username, password);
try
{
Content orderSchema = PX.Soap.Helper.GetSchema<Content>(context);
var commands = new Command[]
{
orderSchema.OrderSummary.ServiceCommands.EveryOrderType,
orderSchema.OrderSummary.OrderType,
};
var types = context.Export(commands, null, 1, false, false);
for (int i = 0; i < types.Length; i++)
{
commands = new Command[]
{
new Value
{
LinkedCommand = orderSchema.OrderSummary.OrderType,
Value = types[i][0]
},
orderSchema.OrderSummary.ServiceCommands.EveryOrderNbr,
orderSchema.OrderSummary.OrderType,
orderSchema.OrderSummary.OrderNbr,
orderSchema.OrderSummary.Customer,
orderSchema.OrderSummary.CustomerOrder,
orderSchema.OrderSummary.Date,
orderSchema.OrderSummary.OrderedQty,
orderSchema.OrderSummary.OrderTotal
};
var orders = context.Export(commands, null, 100, false, false);
while (orders.Length == 100)
{
var filters = new Filter[]
{
new Filter
{
Field = orderSchema.OrderSummary.OrderNbr,
Condition = FilterCondition.Greater,
Value = orders[orders.Length - 1][1]
}
};
orders = context.Export(commands, filters, 100, false, false);
}
}
}
finally
{
context.Logout();
}
The sample above demonstrates how to export all sales orders from Acumatica ERP in batches of 100 records. To export sales orders of each type independently, your SOAP request must always begin with the Value command, which determines the type of orders to be exported. The Value command that sets the first key value is followed by the ServiceCommands.Every[Key] command, where [Key] is replaced with the name of the second key.
To export records of a specific type:
If you need to export sales orders of a specific type only, you can explicitly set the order type with the Value command at the beginning of your SOAP request, followed by either the single-call approach or the export in batches (a condensed sketch of the batched variant follows the single-call sample below).
To export all sales orders of the IN type in a single call:
Screen context = new Screen();
context.CookieContainer = new System.Net.CookieContainer();
context.Url = "http://localhost/AcumaticaERP/Soap/SO301000.asmx";
context.Login(username, password);
try
{
Content orderSchema = PX.Soap.Helper.GetSchema<Content>(context);
var commands = new Command[]
{
new Value
{
LinkedCommand = orderSchema.OrderSummary.OrderType,
Value = "IN"
},
orderSchema.OrderSummary.ServiceCommands.EveryOrderNbr,
orderSchema.OrderSummary.OrderType,
orderSchema.OrderSummary.OrderNbr,
orderSchema.OrderSummary.Customer,
orderSchema.OrderSummary.CustomerOrder,
orderSchema.OrderSummary.Date,
orderSchema.OrderSummary.OrderedQty,
orderSchema.OrderSummary.OrderTotal
};
var orders = context.Export(commands, null, 0, false, false);
}
finally
{
context.Logout();
}
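The same Value command also works with the export in batches. A condensed sketch (it reuses the context, orderSchema, and commands objects built in the sample above):
// Sketch: batched export restricted to orders of the IN type.
var orders = context.Export(commands, null, 100, false, false);
while (orders.Length == 100)
{
var filters = new Filter[]
{
new Filter
{
Field = orderSchema.OrderSummary.OrderNbr,
Condition = FilterCondition.Greater,
// Column 1 of the result is OrderNbr (OrderType is column 0).
Value = orders[orders.Length - 1][1]
}
};
orders = context.Export(commands, filters, 100, false, false);
}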

Related

Bulk Operations - Adding multiple rows to sheet

I'm attempting to write data from a data table to a sheet via the Smartsheet API (using the C# SDK). I have looked at the documentation and I see that it supports bulk operations, but I'm struggling to find an example of that functionality.
I've attempted a workaround and just loop through each record from my source and post that data.
//Get column properties (column Id ) for existing smartsheet and add them to List for AddRows parameter
//Compare to existing Column names in Data table for capture of related column id
var columnArray = getSheet.Columns;
foreach (var column in columnArray)
{
foreach (DataColumn columnPdiExtract in pdiExtractDataTable.Columns)
{
//Console.WriteLine(columnPdiExtract.ColumnName);
if(column.Title == columnPdiExtract.ColumnName)
{
long columnIdValue = column.Id ?? 0;
//addColumnArrayIdList.Add(columnIdValue);
addColumnArrayIdList.Add(new KeyValuePair<string, long>(column.Title,columnIdValue));
}
}
}
foreach(var columnTitleIdPair in addColumnArrayIdList)
{
Console.WriteLine(columnTitleIdPair.Key);
var results = from row in pdiExtractDataTable.AsEnumerable() select row.Field<Double?>(columnTitleIdPair.Key);
foreach (var record in results)
{
Cell[] cells = new Cell[]
{
new Cell
{
ColumnId = columnTitleIdPair.Value,
Value = record
}
};
cellRecords = cells.ToList();
cellRecordsInsert.Add(cellRecords);
}
Row rows = new Row
{
ToTop = true,
Cells = cellRecords
};
IList<Row> newRows = smartsheet.SheetResources.RowResources.AddRows(sheetId, new Row[] { rows });
}
I expected to generate a value for each cell, append it to the list, and then post it through the Row object. However, my loop is appending the column values diagonally, as A1: 1, B2: 2, C3: 3, instead of into a single row as A1: 1, B1: 2, C1: 3.
The preference would be to use bulk operations, but without an example I'm a bit at a loss. However, the loop isn't working out either, so if anyone has any suggestions I would be very grateful!
Thank you,
Channing
Have you seen the Smartsheet C# sample read / write sheet? That may be a useful reference. It contains an example use of bulk operations that updates multiple rows with a single call.
Taylor,
Thank you for your help. You led me in the right direction and I figured my way to a solution.
I grouped my column value list and built records for the final bulk operation. I used a for loop, but the elements in each grouping of columns are cleaned and assigned a 0 before this method runs, so that each grouping retains the same count of values.
// Pair column and cell values for row building - match
// Data source column title names with Smartsheet column title names
List<Cell> pairedColumnCells = new List<Cell>();
//Accumulate cells
List<Cell> cellsToImport = new List<Cell>();
//Accumulate rows for additions here
List<Row> rowsToInsert = new List<Row>();
var groupByCells = PairDataSourceAndSmartsheetColumnToGenerateCells(
sheet,
dataSourceDataTable).GroupBy(
c => c.ColumnId,
c => c.Value,
(key, g) => new {
ColumnId = key, Value = g.ToList<object>()
});
var countGroupOfCells = groupByCells.FirstOrDefault().Value.Count();
for (int i = 0; i <= countGroupOfCells - 1; i++)
{
foreach (var groupOfCells in groupByCells)
{
var cellListElement = groupOfCells.Value.ElementAt(i);
var cellToAdd = new Cell
{
ColumnId = groupOfCells.ColumnId,
Value = cellListElement
};
cellsToImport.Add(cellToAdd);
}
Row rows = new Row
{
ToTop = true,
Cells = cellsToImport
};
rowsToInsert.Add(rows);
cellsToImport = new List<Cell>();
}
return rowsToInsert;
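With the rows accumulated, the actual bulk operation is a single AddRows call with the whole list instead of one call per source record (a short sketch; smartsheet and sheetId are the same objects used in the question):
// Sketch: one bulk call inserts all accumulated rows at once.
var addedRows = smartsheet.SheetResources.RowResources.AddRows(sheetId, rowsToInsert);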

Firebase: How to retrieve data from two tables using id

I come from an SQL background and recently started using Firebase to build an Ionic shopping cart. The database has a cart node, which stores item quantities per user, and a menu node, which stores the item details.
To retrieve a user's cart, I used the following:
var user_id = "user1"; // Temporarily initialised
var refCart = new Firebase("https://testing.firebaseio.com/cart");
var cart=$firebase(fireBaseData.refCart().child(user_id)).$asArray();
This gives the result:
item1 1
item2 2
item3 5
So I tried using forEach():
var refMenu = new Firebase("https://testing.firebaseio.com/menu");
refCart.child(user_id).on("value", function(snapshot) {
snapshot.forEach(function(childSnapshot) {
var item_id = childSnapshot.name();
var qty = childSnapshot.val();
//var data= refMenu.child(item_id).val();
// val is not a function
//var data= refMenu.child(item_id).exportval();
// exportval is not a function
//var data = $firebase(refMenu.child(item_id)).$asArray();
// Gives me an array of objects, i.e. the attributes - OK! But what to do next?
//console.log("DATA",data );
console.log("Item",item_id+" "+qty);
});
});
How can I use item_id to retrieve the item details?
Is this the correct way of retrieving data from multiple tables?
Update:
Using the on() function, I managed to get the item attributes.
refMenu.child(item_id).on("value", function(snapshot) {
console.log("Price",snapshot.val().price);
});
But is there a better implementation for this?
Is there a better way to retrieve (from the server side) specific attributes of the item?
Like in SQL
SELECT name,price, ... from menu;
NOTE: the .on('event', callback) method will call your callback every time the event is fired.
If you need to retrieve data from a reference only once, you should use .once('event', callback).
NOTE 2: snapshot.val() will give you a JSON object that you can assign to a variable.
I would do it this way:
refCart.child(user_id).on("value", function(snapshot) {
snapshot.forEach(function(childSnapshot) {
var item_id = childSnapshot.name();
var qty = childSnapshot.val();
refMenu.child(item_id).once("value", function(snapshot) {
var item = snapshot.val()
console.log(item.name +' '+ item.price)
});
});
});
Hope it helps ;)

Meteor mongodb saves number as 'NaN'

I'm using the following method to add tasks to Mongo. However, the 'rank' keeps being saved in the database as 'NaN'.
addTask: function (data) {
var data = data || {};
data.createdAt = new Date();
data.status = data.status || null;
data.owner = Meteor.userId();
var userID = Meteor.userId();
// Get the lowest rank for all non-checked tasks
minRank = Tasks.find({status: null}, {sort: {rank: 1}}).fetch();
data.rank = minRank.length > 0 ? minRank[0].rank - 1 : 0;
Tasks.insert(data);
}
I've used console.log to confirm data.rank is a number, AND I've printed the rank on the page, which flashes briefly as the correct number in the UI before the server catches up with the client and changes it to NaN.
Any ideas?
It turns out Tasks.find() was returning different results on the server than it was on the client.
On the server it was returning results for tasks owned by any user, whereas the client only returned results owned by the current user, because that's all it had access to.
In my case, the lowest-ranked result on the server side had no rank, so the calculation came back as NaN.

How to implement search with multiple filters using lucene.net

I'm new to Lucene.Net. I want to implement search functionality on a client database. I have the following scenario:
Users search for clients based on the currently selected city.
If the user wants to search for clients in another city, he has to change the city and perform the search again.
To refine the search results, we need to provide filters on Areas (multiple), Pincode, etc. In other words, I need Lucene queries equivalent to the following SQL queries:
SELECT * FROM CLIENTS
WHERE CITY = N'City1'
AND (Area like N'%area1%' OR Area like N'%area2%')
SELECT * FROM CLIENTS
WHERE CITY IN ('MUMBAI', 'DELHI')
AND CLIENTTYPE IN ('GOLD', 'SILVER')
Below is the code I've implemented to provide search with city as a filter:
private static IEnumerable<ClientSearchIndexItemDto> _search(string searchQuery, string city, string searchField = "")
{
// validation
if (string.IsNullOrEmpty(searchQuery.Replace("*", "").Replace("?", "")))
return new List<ClientSearchIndexItemDto>();
// set up Lucene searcher
using (var searcher = new IndexSearcher(_directory, false))
{
var hits_limit = 1000;
var analyzer = new StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);
// search by single field
if (!string.IsNullOrEmpty(searchField))
{
var parser = new QueryParser(Lucene.Net.Util.Version.LUCENE_30, searchField, analyzer);
var query = parseQuery(searchQuery, parser);
var hits = searcher.Search(query, hits_limit).ScoreDocs;
var results = _mapLuceneToDataList(hits, searcher);
analyzer.Close();
searcher.Dispose();
return results;
}
else // search by multiple fields (ordered by RELEVANCE)
{
var parser = new MultiFieldQueryParser(Lucene.Net.Util.Version.LUCENE_30, new[]
{
"ClientId",
"ClientName",
"ClientTypeNames",
"CountryName",
"StateName",
"DistrictName",
"City",
"Area",
"Street",
"Pincode",
"ContactNumber",
"DateModified"
}, analyzer);
var query = parseQuery(searchQuery, parser);
var f = new FieldCacheTermsFilter("City",new[] { city });
var hits = searcher.Search(query, f, hits_limit, Sort.RELEVANCE).ScoreDocs;
var results = _mapLuceneToDataList(hits, searcher);
analyzer.Close();
searcher.Dispose();
return results;
}
}
}
Now I have to provide more filters on Area, Pincode, etc., where Area can have multiple values. I tried a BooleanQuery like the one below:
var cityFilter = new TermQuery(new Term("City", city));
var areasFilter = new FieldCacheTermsFilter("Area", areas); // where areas is of type string[]
BooleanQuery filterQuery = new BooleanQuery();
filterQuery.Add(cityFilter, Occur.MUST);
filterQuery.Add(areasFilter, Occur.MUST); // filterQuery.Add does not have an overload that accepts this filter
If we perform the same operation with a single area, it works fine.
I've also tried ChainedFilter, as shown below, but it doesn't seem to satisfy the requirement: the code below performs an OR operation across city and areas, whereas the requirement is to perform the OR operation between the areas within the given city.
var f = new ChainedFilter(new Filter[] { cityFilter, areasFilter });
Can anybody suggest how to achieve this in Lucene.Net? Your help will be appreciated.
You're looking for the BooleanFilter. Almost any query object has a matching filter object.
Look into TermsFilter (from Lucene.Net.Contrib.Queries) if your index doesn't match the requirements of FieldCacheTermsFilter. From the documentation of the latter: "this filter requires that the field contains only a single term for all documents".
var cityFilter = new FieldCacheTermsFilter("CITY", new[] {"MUMBAI", "DELHI"});
var clientTypeFilter = new FieldCacheTermsFilter("CLIENTTYPE", new [] { "GOLD", "SILVER" });
var areaFilter = new TermsFilter();
areaFilter.AddTerm(new Term("Area", "area1"));
areaFilter.AddTerm(new Term("Area", "area2"));
var filter = new BooleanFilter();
filter.Add(new FilterClause(cityFilter, Occur.MUST));
filter.Add(new FilterClause(clientTypeFilter, Occur.MUST));
filter.Add(new FilterClause(areaFilter, Occur.MUST));
IndexSearcher searcher = null; // TODO.
Query query = null; // TODO.
Int32 hits_limit = 0; // TODO.
var hits = searcher.Search(query, filter, hits_limit, Sort.RELEVANCE).ScoreDocs;
What you are looking for is nested boolean queries, so that you have an OR (on your cities) but the whole group (matching the OR) is itself matched as an AND:
filter1 AND filter2 AND filter3 AND (filtercity1 OR filtercity2 OR filtercity3)
There is already a good description of how to do this here:
How to create nested boolean query with lucene API (a AND (b OR c))?
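A minimal sketch of that pattern with Lucene.Net 3.0.3 BooleanQuery objects (the field names and term values below are placeholders, not taken from the question's index; the terms are lowercased to match StandardAnalyzer):
// Inner group: OR between the cities.
var cityGroup = new BooleanQuery();
cityGroup.Add(new TermQuery(new Term("City", "mumbai")), Occur.SHOULD);
cityGroup.Add(new TermQuery(new Term("City", "delhi")), Occur.SHOULD);
// Outer query: the other conditions must match, and the city group as a whole must match.
var combined = new BooleanQuery();
combined.Add(new TermQuery(new Term("ClientType", "gold")), Occur.MUST);
combined.Add(cityGroup, Occur.MUST);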

EF 6 - many-to-many - Join table without duplicates

I'm using EF6 and have some confusion about seeding a many-to-many relationship.
I have the following:
A User has many saved ChartQueries (that they can execute to get a chart).
A ChartQuery typically belongs to only one user, but there are several "shared" ChartQueries that every User can execute. As a result, I set up a many-to-many relationship using a join table, UserChartQuery. The tables are set up in the database just fine, with a one-to-many relationship on each side of the join table.
However, I'm not quite understanding how to seed or use this relationship. I don't want to end up with several duplicates of the "shared" ChartQueries (a duplicate for each User). Instead, there should be only a single row for each "shared" ChartQuery, and it should be part of each User's SavedChartQueries collection (along with the non-shared ChartQueries that belong to that User only).
It seems like I'm forced to duplicate for each user:
var sharedChartQuery = new ChartQuery { ... };
var nonSharedChartQuery = new ChartQuery { ... };
var userOneChartQueryOne = new UserChartQuery { User = userOne, ChartQuery = sharedChartQuery };
var userTwoChartQueryOne = new UserChartQuery { User = userTwo, ChartQuery = sharedChartQuery };
var userTwoChartQueryTwo = new UserChartQuery { User = userTwo, ChartQuery = nonSharedChartQuery };
context.UserChartQueries.Add(userOneChartQueryOne);
context.UserChartQueries.Add(userTwoChartQueryOne);
context.UserChartQueries.Add(userTwoChartQueryTwo);
So, first of all, is this the right way to seed (through the UserChartQueries table directly), or should I seed each User's SavedChartQueries navigation property?
And will this result in duplicate sharedChartQuery rows for each User? If so, is there any way to avoid this?
Ok I understand how this works now. The following works as expected:
var userOne = new User {};
var userTwo = new User {};
var chartQuery = new ChartQuery { };
context.Users.Add(userOne);
context.Users.Add(userTwo);
context.UserChartQueries.Add(new UserChartQuery { User = userOne, ChartQuery = chartQuery });
context.UserChartQueries.Add(new UserChartQuery { User = userTwo, ChartQuery = chartQuery });
context.ChartQueries.Add(chartQuery);
The last line adds it to the table where the record actually resides. Checking the join table in SSMS shows that it only holds the foreign keys and nothing else. There are no duplicates.
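For completeness, reading a user's saved queries back through the join table is then a simple projection (a sketch; the userId variable and the User.Id property name are assumptions, not taken from the original model):
// Sketch: all ChartQueries saved by one user, shared ones included;
// the ChartQuery rows themselves are never duplicated.
var savedQueries = context.UserChartQueries
.Where(ucq => ucq.User.Id == userId)
.Select(ucq => ucq.ChartQuery)
.ToList();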