Invalid large object descriptor: 0 with Hibernate and PostgreSQL

We have an n-tier application that reads BLOB objects stored in a PostgreSQL database.
At times, when we try to access a BLOB through an input stream, we get "org.postgresql.util.PSQLException: ERROR: invalid large-object descriptor: 0". According to other posts, this exception occurs when the BLOB is accessed outside the transaction (i.e. after the transaction has been committed).
In our case, however, we get this exception even though the transaction is still active; the BLOB is read within the transaction.
Any pointers as to why this exception occurs even though the transaction is active?

Your description of the problem is short on specifics, but in my code this error showed up when I tried to use the Large Object outside the data access method. As in your case, the object was created inside that method. This is consistent with what other people have noted in this forum: a Large Object exists only within the data access method (or transaction). I needed a byte[], so I converted the Large Object inside the method, wrapped it in a Data Transfer Object, and was able to use it in the other layers. These are the relevant code snippets:
// This is the Data Access Class
@Named
public class SupportDocsDAO {

    protected ResultSet resultSet;
    private LargeObject lob;
    // SupportDocs is an entity class in the Data Transfer Objects package
    private SupportDocs supportDocsDTO;

    public LargeObject getLob() {
        return lob;
    }

    public void setLob(LargeObject lob) {
        this.lob = lob;
    }

    public SupportDocs getSupportDocsDTO() {
        return supportDocsDTO;
    }

    public void setSupportDocsDTO(SupportDocs supportDocsDTO) {
        this.supportDocsDTO = supportDocsDTO;
    }

    // .... other code

    public SupportDocs fetchSupportDocForDescr(SupportDocs supportDocs1) {
        Session session = HibernateUtil.getSessionFactory().openSession();
        session.doWork(new Work() {
            @Override
            public void execute(java.sql.Connection connection) throws SQLException {
                java.sql.PreparedStatement ps = null;
                try {
                    LargeObjectManager lobm =
                            connection.unwrap(org.postgresql.PGConnection.class).getLargeObjectAPI();
                    ps = connection.prepareCall("{call ret_lo_supportdocs_id(?)}");
                    ps.setInt(1, supportDocs1.getSuppDocId());
                    ps.execute();
                    resultSet = ps.getResultSet();
                    while (resultSet.next()) {
                        supportDocsDTO.setFileNameDoc(resultSet.getString("filenamedoc"));
                        supportDocsDTO.setExtensionSd(resultSet.getString("extensionsd"));
                        long oid = resultSet.getLong("suppdoc_oid");
                        setLob(lobm.open(oid, LargeObjectManager.READ));
                        // This is the conversion of the Large Object into byte[]
                        supportDocsDTO.setSuppDocImage(lob.read(lob.size()));
                        System.out.println("object size: " + lob.size());
                    }
                    // other code, catch, cleanup with finally, and return supportDocsDTO
This works without problems. I can recreate images and videos from the obtained byte[].
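For illustration, here is a minimal caller sketch under a few assumptions: getSuppDocImage(), getFileNameDoc() and getExtensionSd() are the getters matching the setters above, setSuppDocId() is the counterpart of getSuppDocId(), and the DAO is constructed directly instead of being injected:

SupportDocsDAO dao = new SupportDocsDAO();   // normally injected in a CDI/Spring container

SupportDocs request = new SupportDocs();
request.setSuppDocId(42);                    // assumed setter matching getSuppDocId()

// Only the byte[] carried by the DTO leaves the DAO layer; no Large Object escapes the transaction.
SupportDocs result = dao.fetchSupportDocForDescr(request);
java.nio.file.Files.write(
        java.nio.file.Paths.get(result.getFileNameDoc() + result.getExtensionSd()),
        result.getSuppDocImage());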

Related

I have "Roach Motel Data" - data go into DB fine, cannot get back out

I'm using Xamarin.Forms with EF Core and SQLite. I've installed the Microsoft.EntityFrameworkCore.Sqlite NuGet package in my project. The code in question is in the shared code project, targeting .NET Standard 2.0.
I have created a simple class, let's say a Cat class, to hold my DB table objects.
I can use the EnsureCreated command and that works fine.
I can create a Cat, set its properties, and SaveChanges() to the DB; this works fine, and I can see the data in the DB.
I cannot get the data back out; I get an "Object reference not set to an instance of an object" error.
Ignore my couple of outer curly braces; I'm new to posting code and this was the only way to get it all together in one block. I have handled the platform-specific (Android and iOS) code for obtaining the dbPath to the SQLite .db3 file (not shown here).
I cannot figure out what I'm missing that keeps any data from coming back out of the DB. Any help much appreciated!
{
    public class DatabaseContext : DbContext
    {
        string _dbPath;

        public DbSet<Cat> Cats { get; set; }

        public DatabaseContext(string dbPath)
        {
            _dbPath = dbPath;
            Database.EnsureCreatedAsync();
        }

        public async Task<IEnumerable<Cat>> GetCats()
        {
            var allCats = await Cats.ToListAsync().ConfigureAwait(false);
            return allCats;
        }

        protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        {
            optionsBuilder.UseSqlite($"Filename={_dbPath}");
        }
    }

    List<Cat> itemSource;

    // Create Database & Tables
    using (var db = new DatabaseContext(App.dbPath))
    {
        // Ensure database is created
        db.Database.EnsureCreated();

        // Insert Data
        db.Add(new Cat() { IdCat = "111", Fname = "Felix1" });
        db.SaveChanges();

        // Retrieve Data
        // method 1
        // RESULT: no data are in "itemSource", info reads "exception count = 1"
        itemSource = db.Cats.ToList();

        // method 2
        // RESULT: crashes with "System.NullReferenceException: Object reference not set to an instance of an object."
        Task<IEnumerable<Cat>> p = db.GetCats();
        itemSource = db.Cats.ToList();
    }
}

Proper way to write a spring-batch ItemReader

I'm constructing a spring-batch job that modifies a given number of records. The list of record IDs is an input parameter of the job. For example, one job might be: modify the records with IDs {1, 2, 3, 4} and set parameters X and Y on related tables.
Since I'm unable to pass a potentially very long input list (typically around 50K records) to my ItemReader, I only pass a MyJobID, which the ItemReader then uses to load the target ID list.
The problem is that the resulting code feels "wrong" (although it works) and not in the spirit of spring-batch. Here's the reader:
@Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
@Component
public class MyItemReader implements ItemReader<Integer> {

    @Autowired
    private JobService jobService;

    private List<Integer> itemsList;
    private Long jobId;

    @Autowired
    public MyItemReader(@Value("#{jobParameters['jobId']}") final Long jobId) {
        this.jobId = jobId;
        this.itemsList = null;
    }

    @Override
    public Integer read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        // First pass: load the list.
        if (itemsList == null) {
            itemsList = new ArrayList<Integer>();
            MyJob myJob = (MyJob) jobService.loadById(jobId);
            for (Integer i : myJob.getTargedIdList()) {
                itemsList.add(i);
            }
        }
        // Serve one item at a time:
        if (itemsList.isEmpty()) {
            return null;
        } else {
            return itemsList.remove(0);
        }
    }
}
I tried to move the first part of the read() method to the constructor, but the @Autowired reference is null at that point; it is only initialized later, when read() is called.
Is there a better way to write the ItemReader? I would like to move the "load" step out of read(). Or is this the best solution for this scenario?
Thank you.
Generally, your approach is not "wrong", but probably not ideal.
Firstly, you could move the initialisation to an init method annotated with @PostConstruct. This method is called after all @Autowired fields have been injected:
@PostConstruct
public void afterPropertiesSet() throws Exception {
    itemsList = new ArrayList<Integer>();
    MyJob myJob = (MyJob) jobService.loadById(jobId);
    for (Integer i : myJob.getTargedIdList()) {
        itemsList.add(i);
    }
}
But there is still the problem that you load all the data at once. If you have a billion records to process, this could blow up the memory.
So what you should do instead is load only a chunk of your data into memory, then return its items one by one from your read method. When all entries of a chunk have been returned, load the next chunk and again return its items one by one. If there is no further chunk to load, return null from read.
This ensures that you have a constant memory footprint regardless of how many records you have to process.
(If you have a look at FlatFileItemReader, you will see that it uses a BufferedReader to read the data from disk. While that has nothing to do with Spring Batch, it is the same principle: it reads a chunk of data from disk, returns it, and reads the next chunk only when more data is needed.)
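For example, here is a rough sketch of such a paged read(); loadTargetIdPage() is a hypothetical JobService method that returns at most pageSize IDs starting at the given offset:

private static final int PAGE_SIZE = 1000;
private final List<Integer> currentPage = new ArrayList<Integer>();
private int itemsRead = 0;   // total number of IDs handed out so far

@Override
public Integer read() throws Exception {
    if (currentPage.isEmpty()) {
        // Hypothetical paging call; use whatever query fits your schema.
        currentPage.addAll(jobService.loadTargetIdPage(jobId, itemsRead, PAGE_SIZE));
    }
    if (currentPage.isEmpty()) {
        return null;   // no more data: signals end of input to Spring Batch
    }
    itemsRead++;
    return currentPage.remove(0);
}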
The next problem is restartability. What happens if the job crashes after doing 90% of the work? How can the job be restarted so that it only processes the missing 10%?
This is actually a feature that Spring Batch provides; all you have to do is implement the ItemStream interface with its methods open(), update(), and close().
If you address these two points (load the data in chunks instead of all at once, and implement the ItemStream interface), you'll end up with a reader that is in the spirit of Spring Batch.
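Building on the fields from the paging sketch above (and with the reader also declaring implements ItemStream), the ItemStream methods could persist the reader's position in the ExecutionContext; the key name here is arbitrary:

@Override
public void open(ExecutionContext executionContext) throws ItemStreamException {
    // On a restart, resume after the items that were already processed.
    itemsRead = executionContext.getInt("myItemReader.items.read", 0);
    currentPage.clear();
}

@Override
public void update(ExecutionContext executionContext) throws ItemStreamException {
    // Called at each chunk boundary: record how far the reader has got.
    executionContext.putInt("myItemReader.items.read", itemsRead);
}

@Override
public void close() throws ItemStreamException {
    currentPage.clear();
}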

MongoDB Error - Can't get Player Data

I'm working on a framework for a Minecraft server and I'm running into a constant error while retrieving player data from a MongoDB database. I have both a proxy plugin and a Spigot plugin that the framework is shaded into.
For the proxy plugin, I can get and store the player data in my PlayerCache (a map that assigns UUIDs to PlayerData objects). For the Spigot plugin, however, I cannot: I get a NullPointerException because the data is not found in the cache, even though I am sure I called the cachePlayer method.
I believe what is going on is that I'm not allowing enough time for the database call to fetch the information and cache it.
Spigot code:
@EventHandler
public void onLogin(PlayerLoginEvent event)
{
    PlayerCache.getInstance().cachePlayer(event.getPlayer().getUniqueId());
}

@EventHandler
public void onJoin(PlayerJoinEvent event)
{
    event.setJoinMessage(null);
    event.getPlayer().setDisplayName(PlayerCache.getInstance().getCachedRank(
            event.getPlayer().getUniqueId()
    ).getColor() + event.getPlayer().getName()); // Throws a null pointer.
}
PlayerCache:
public void cachePlayer(UUID uuid)
{
    PlayerData PD = PlayerManager.getInstance().getPlayerData(uuid);
    this.cache.put(uuid, PD);
}

public PlayerRank getCachedRank(UUID uuid)
{
    return this.cache.get(uuid).getRank();
}
Hope someone can help.

Ormlite and PostgreSQL - Error inserting text array with custom persister

I have been working to set up ORMLite as the primary data access layer between a PostgreSQL database and a Java application. Everything has been fairly straightforward until I started using PostgreSQL's array types. In my case, I have two tables that make use of the text[] array type. Following the documentation, I created a custom data persister as below:
public class StringArrayPersister extends StringType {

    private static final StringArrayPersister singleTon = new StringArrayPersister();

    private StringArrayPersister() {
        super(SqlType.STRING, new Class<?>[]{String[].class});
    }

    public static StringArrayPersister getSingleton() {
        return singleTon;
    }

    @Override
    public Object javaToSqlArg(FieldType fieldType, Object javaObject) {
        String[] array = (String[]) javaObject;
        if (array == null) {
            return null;
        } else {
            String join = "";
            for (String str : array) {
                join += str + ",";
            }
            return "'{" + join.substring(0, join.length() - 1) + "}'";
        }
    }

    @Override
    public Object sqlArgToJava(FieldType fieldType, Object sqlArg, int columnPos) {
        String string = (String) sqlArg;
        if (string == null) {
            return null;
        } else {
            return string.replaceAll("[{}]", "").split(",");
        }
    }
}
And then in my business object implementation, I set up the persister class on the column like so:
@DatabaseField(columnName = TAGS_FIELD, persisterClass = StringArrayPersister.class)
private String[] tags;
Whenever I try inserting a new record with the Dao.create statement, I get an error message saying that tags is of type text[] but got character varying... However, when querying existing records from the database, the business object (and its text array) loads just fine.
Any ideas?
UPDATE:
PostgreSQL 9.2. The exact error message:
Caused by: org.postgresql.util.PSQLException: ERROR: column "tags" is of type text[] but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
I've not used ORMLite before (I generally use MyBatis); however, I believe the proximal issue is this code:
private StringArrayPersister() {
    super(SqlType.STRING, new Class<?>[]{String[].class});
}
SqlType.STRING is mapped to VARCHAR in the ORMLite code, and that is therefore, I believe, the proximal cause of the error you're getting. See the ORMLite SQL Data Types documentation for more detail on that.
Try changing it to this:
private StringArrayPersister() {
    super(SqlType.OTHER, new Class<?>[]{String[].class});
}
There may be other tweaks necessary as well to get it fully up and running, but that should get you past this particular error with the varchar type mismatch.
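Assuming the mapping from the question stays as it is, a usage sketch would then look roughly like this (MyBusinessObject and its tags accessors are hypothetical stand-ins for the question's entity, and connectionSource is assumed to be set up elsewhere):

Dao<MyBusinessObject, Integer> dao = DaoManager.createDao(connectionSource, MyBusinessObject.class);

MyBusinessObject obj = new MyBusinessObject();
obj.setTags(new String[] { "alpha", "beta" });   // converted by StringArrayPersister.javaToSqlArg()
dao.create(obj);                                 // the insert that previously failed with the text[] vs varchar error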

DaoException: Entity is detached from DAO context

I have two entities, User and Store, where User has a one-to-many (1:M) relation to Store. I've inserted a list of stores into the store table with the following code.
public void saveStoresToDatabase(Context context, ArrayList<Store> storeList) {
    DevOpenHelper helper = new DaoMaster.DevOpenHelper(context, "notes-db", null);
    SQLiteDatabase db = helper.getWritableDatabase();
    DaoMaster daoMaster = new DaoMaster(db);
    DaoSession daoSession = daoMaster.newSession();
    StoreDao storeDao = daoSession.getStoreDao();

    ArrayList<Store> list = SharedData.getInstance().getUser().getStoreList();
    for (int i = 0; i < storeList.size(); i++) {
        storeList.get(i).setUserIdForStore(SharedData.getInstance().getUser().getId());
    }
    storeDao.insertOrReplaceInTx(storeList);
    list.addAll(storeList);
    user.resetStoreList();
}
I get an "entity is detached from DAO context" exception whenever I try to call user.getStoreList(). The exception occurs in the following code snippet because the daoSession is null.
public ArrayList<Store> getDMStoreListFromDatabase(Context context) {
    return SharedData.getInstance().getUser().getStoreList();
}
where SharedData is my singleton, having a user object:
private SharedData() {
    user = new User();
}
and I get the SharedData instance as follows:
public static synchronized SharedData getInstance() {
    if (sharedObject == null) {
        sharedObject = new SharedData();
    }
    return sharedObject;
}
Objects representing database entries (like User) are only attached to a database session if they have been fetched from the database or inserted into it before.
It looks like you don't load your user object using greenDAO, but instead just create it with new.
You also don't seem to store this user object using its DAO. Thus the user object is not attached to the session.
On top of that, you are only setting the user ID on each store. If you haven't inserted the user object somewhere else, this may also cause an error, since the foreign-key constraint may be broken (depending on how greenDAO handles this internally).
Try adding the user object to the stores with setUser() instead of setUserIdForStore().
If this doesn't work, try to store or load the user object first using a UserDao.
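Put together, a minimal sketch along those lines might look like this (the DAO and relation methods are the ones greenDAO would generate from the question's schema, so the exact names may differ in your project; daoMaster is the instance created in your save method):

DaoSession daoSession = daoMaster.newSession();

// Persist the user through its DAO first so it becomes attached to the session.
User user = SharedData.getInstance().getUser();
daoSession.getUserDao().insertOrReplace(user);

// Link each store to the attached user entity instead of only copying its id.
for (Store store : storeList) {
    store.setUser(user);
}
daoSession.getStoreDao().insertOrReplaceInTx(storeList);

// The to-many relation can now be resolved through the session.
List<Store> stores = user.getStoreList();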