Import a single PDF file to DSpace 6.x on CentOS 7 - import

I am trying to import a new single PDF file into DSpace 6.3 on CentOS 7 after a successful backup of files from a batch import (ZIP), but the XMLUI page shows several errors:
When I try to import, I get this error after accepting the license: Error while attempting to create identifier for Item id: 0b85c570-ab76-483a-9cdc-3db7b26716cb
When I try to create a new community: ERROR: duplicate key value violates unique constraint "handle_handle_key" Detail: Key (handle)=(123456789/41) already exists.
When I try to create a new collection: ERROR: duplicate key value violates unique constraint "handle_handle_key" Detail: Key (handle)=(123456789/42) already exists.
How can I fix these problems? Before the backup I could import new PDF files without trouble, but after the backup I get these errors.
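These duplicate handle errors usually mean that the handle_seq sequence in the DSpace database has fallen behind the highest handle suffix already stored in the handle table, which can happen after restoring from a backup. Below is a minimal sketch of how to check and resync it in psql, assuming a standard DSpace 6.x schema where new handle suffixes come from handle_seq and all existing suffixes are numeric; DSpace also ships an update-sequences.sql script (under [dspace]/etc/postgres) that resets all sequences in one go:

-- Compare the sequence position with the highest suffix already in use
-- (the 123456789 prefix comes from the error messages above).
SELECT last_value FROM handle_seq;
SELECT MAX(CAST(regexp_replace(handle, '^.*/', '') AS BIGINT)) FROM handle;

-- If the sequence is behind, move it past the highest existing suffix so the
-- next community/collection/item gets a handle that does not collide.
SELECT setval('handle_seq',
    (SELECT MAX(CAST(regexp_replace(handle, '^.*/', '') AS BIGINT)) FROM handle));

After resetting the sequence, retrying the import should no longer hit the duplicate key error.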

Related

Liquibase Validation Failed Exception - change sets check sum

After launching the application, I get an error:
Caused by: liquibase.exception.ValidationFailedException: Validation Failed: 1 change sets checksum was: 8:2b2936713e8d9aea052c3122fd81faec but now it is: 8:ed8f7550fdd9809f4f6bf0f2d83dbbd8
The error points to such a table:
create table car (
    id bigint not null auto_increment PRIMARY KEY,
    name varchar(255) not null,
    category varchar(255) not null
);
I read about this error, and it was suggested to use the mvn liquibase:clearCheckSums command, but in the terminal I get the error: Error: -classpath requires class path specification (I run the command in the project folder).
The error indicates that a change set that has already run is now being run with changes. If you change anything in an already-run change set, its checksum changes, and this error is completely normal.
Check your databasechangelog table for the mentioned checksum to identify the change set that is failing. The usual solution is not to change an already-run change set (and thus its checksum) in the first place. If you can delete that change set's row from the database and run it again, it will work fine. Sometimes the same change set fails without anything being changed; this happens to me all the time because of the line separator. For example, in IntelliJ you can change the file's line separator setting.
Check which line separator you need by testing :)
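If dropping and re-running the change set is not an option, another common workaround is to clear the stored checksum for just that change set in the tracking table; on the next update Liquibase recomputes and stores the new value instead of failing. A sketch against the default databasechangelog table (the id in the UPDATE is a placeholder, use the one returned by the SELECT):

-- The "was" value from the error is the checksum currently stored in the table.
SELECT id, author, filename, md5sum
FROM databasechangelog
WHERE md5sum = '8:2b2936713e8d9aea052c3122fd81faec';

-- Clear only that row's checksum; Liquibase fills it in again on the next run
-- without re-executing the change set.
UPDATE databasechangelog
SET md5sum = NULL
WHERE id = 'your-changeset-id';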

Why is liquibase deleting databasechangelog rows and trying to create a renamed database table?

I am using postgres 10.5 and liquibase 3.6.2 on a Mac.
I nuke & re-create my database, run liquibase update, and it works.
But a second liquibase update fails with an exception that the pkey already exists.
After the first liquibase update, the databasechangelog table contains 97 entries. After the second, it contains 10, and the time and deployment ids for those are different than they were after the first update!
Table foo was created in an early change set.
Later it was renamed to bar, but the primary key is still foo.pkey.
liquibase update should not be trying to re-create foo, but it does, and it fails because foo.pkey already exists.
A) In general, how can I get liquibase to output more info about what it's doing? I tried both of the commands:
liquibase --logLevel=debug --logFile=`pwd`/foo.log update
liquibase --logLevel debug --logFile `pwd`/foo.log update
Both seem to behave the same: foo.log isn't created, and there is no additional output in the terminal.
B) How can I stop Liquibase from trying to re-create this table and from wiping my databasechangelog?
I tried to make a small example that fails, but this seems to work... Others here are using it with postgres 9.5.10 with no problem...
All I see in the terminal is:
Starting Liquibase at Wed, 14 Nov 2018 13:06:44 PST (version 3.6.2 built at 2018-07-03 11:28:09)
Unexpected error running Liquibase: ERROR: relation "cant_change_pkey" already exists [Failed SQL: CREATE TABLE nuss.cant_change (message_id UUID NOT NULL, origin VARCHAR(4), type VARCHAR(12) NOT NULL, CONSTRAINT CANT_CHANGE_PKEY PRIMARY KEY (message_id), UNIQUE (message_id))]
liquibase.exception.MigrationFailedException: Migration failed for change set db/changelog/changelog-new1.xml::first-one::rstrauss:
Reason: liquibase.exception.DatabaseException: ERROR: relation "cant_change_pkey" already exists [Failed SQL: CREATE TABLE nuss.cant_change (message_id UUID NOT NULL, origin VARCHAR(4), type VARCHAR(12) NOT NULL, CONSTRAINT CANT_CHANGE_PKEY PRIMARY KEY (message_id), UNIQUE (message_id))]
at liquibase.changelog.ChangeSet.execute(ChangeSet.java:637)
at liquibase.changelog.visitor.UpdateVisitor.visit(UpdateVisitor.java:53)
at liquibase.changelog.ChangeLogIterator.run(ChangeLogIterator.java:78)
at liquibase.Liquibase.update(Liquibase.java:202)
at liquibase.Liquibase.update(Liquibase.java:179)
at liquibase.integration.commandline.Main.doMigration(Main.java:1205)
at liquibase.integration.commandline.Main.run(Main.java:191)
at liquibase.integration.commandline.Main.main(Main.java:129)
Caused by: liquibase.exception.DatabaseException: ERROR: relation "cant_change_pkey" already exists [Failed SQL: CREATE TABLE nuss.cant_change (message_id UUID NOT NULL, origin VARCHAR(4), type VARCHAR(12) NOT NULL, CONSTRAINT CANT_CHANGE_PKEY PRIMARY KEY (message_id), UNIQUE (message_id))]
at liquibase.executor.jvm.JdbcExecutor$ExecuteStatementCallback.doInStatement(JdbcExecutor.java:356)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:57)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:125)
at liquibase.database.AbstractJdbcDatabase.execute(AbstractJdbcDatabase.java:1229)
at liquibase.database.AbstractJdbcDatabase.executeStatements(AbstractJdbcDatabase.java:1211)
at liquibase.changelog.ChangeSet.execute(ChangeSet.java:600)
... 7 common frames omitted
Caused by: org.postgresql.util.PSQLException: ERROR: relation "cant_change_pkey" already exists
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2476)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2189)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:300)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:428)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:354)
at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:301)
at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:287)
at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:264)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:260)
at liquibase.executor.jvm.JdbcExecutor$ExecuteStatementCallback.doInStatement(JdbcExecutor.java:352)
... 12 common frames omitted
For more information, please use the --logLevel flag
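Not an answer to B), but for A) one way to see exactly what Liquibase recorded between the two runs is to query the tracking table directly after each update (a sketch, assuming the default databasechangelog table name):

-- Run this after the first and after the second "liquibase update" and diff the
-- output: it shows which rows disappeared and which deployment_id they got.
SELECT orderexecuted, id, author, filename, dateexecuted, deployment_id, md5sum
FROM databasechangelog
ORDER BY orderexecuted;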

Typo3 Upgrade 7.6.31 to 8.7.19 / Database Analyzer fails

During the upgrade process from 7.6 to 8.7, the Database Analyzer fails with the following statement:
Error:
Database update failed
Error: Specified key was too long; max key length is 1000 bytes
Can you help?
I found the solution.
drop index lookup_string on sys_refindex;
Then you can run the Database Analyzer, which does it in two steps:
ALTER TABLE ....
CREATE INDEX ....
But this is done by the Database Analyzer in the Upgrade Wizard anyway.
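If you want to verify what is there before and after, you can inspect the index from the database prompt (MySQL syntax, since the 1000-byte key limit in the error is MySQL's; sys_refindex and lookup_string are the table and index from the error above):

-- Definition of the index before dropping it.
SHOW INDEX FROM sys_refindex WHERE Key_name = 'lookup_string';

-- After the Database Analyzer has recreated it, check the new definition.
SHOW CREATE TABLE sys_refindex;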

postgres : column "id" violates not-null constraint\n Detail: Failing row contains

Hi, I'm new to Java web development.
I wrote a simple Play Framework app in Java.
I want to save some contacts in an Ebean database; it works well on my PC.
I deployed my app to Heroku and I'm using the Postgres add-on.
The problem is that when I POST the following request, I get this error:
req:/contacts/post?name=name&email=email%40mail.com&phone=234234
err:
play.api.UnexpectedException: Unexpected exception[PersistenceException: ERROR executing DML bindLog[] error[ERROR: null value in column "id" violates not-null constraint\n Detail: Failing row contains (null, name, 234234, email#mail.com).]]
But if I add &id=223434, it works fine!
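That error means the id column in the Heroku Postgres database has no auto-generated value, so the insert sends NULL. The usual fix on the Java side is to let Ebean generate the id (an @Id annotated Long field on the model) and recreate the table; as a database-side workaround you can also attach a sequence as the column default. A sketch, assuming the contacts end up in a table named contact with a bigint id column (both names are assumptions, check your actual schema):

-- Give id a default drawn from a sequence so inserts that omit it succeed.
CREATE SEQUENCE contact_id_seq;
ALTER TABLE contact
    ALTER COLUMN id SET DEFAULT nextval('contact_id_seq');

-- Start the sequence above any ids that already exist in the table.
SELECT setval('contact_id_seq', COALESCE((SELECT MAX(id) FROM contact), 0) + 1, false);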

MongoDB E11000 duplicate key error on mydb.testlookup.$name dup key: {:dummy123} in MEAN stack using angular-fullstack generator

Hi, I am trying to create a simple project using the angular-fullstack generator. I have MongoDB and Node.js running on Windows, and everything is installed and running perfectly. I have created one schema as follows:
'use strict';

var mongoose = require('mongoose'),
    Schema = mongoose.Schema;

var TestlookupSchema = new Schema({
  name: String,
  ccode: String,
  description: String,
  info: String,
  active: Boolean
});

module.exports = mongoose.model('Testlookup', TestlookupSchema);
I didn't touch any of the other default schemas that come with the generator demo app.
Whenever I insert a record into this collection, I get the error below:
E11000 duplicate key error index: mydb.testlookup.$name dup key: {:dummy123}
I am using Windows 7 as the operating system
Node.js 4.x
MongoDB 3.x
What might be causing this error?
I had the same problem and resolved it by deleting the index. When you generate an angular-fullstack app, it creates a Thing schema that has a name field; when you create another schema with the same name field, an index gets created on it. If you enter the same data (in your case "dummy123") in the name field for both schemas, it gives the duplicate key index error E11000.
Solution for this if you are on Windows:
Go to the Mongo shell
command prompt - mongo.exe
use mydb
db.testlookup.getIndexes()
You will find the index on name; just drop it and let it be recreated
db.testlookup.dropIndex("name")
Now restart your Node app using grunt serve and you won't get that problem again.