GWT + Hibernate + Gilead is not working

Hello, my GWT + Hibernate + Gilead setup is not working. I downloaded the sample code from this link:
https://developers.google.com/web-toolkit/articles/using_gwt_with_hibernate.
I don't know what is wrong; I keep getting this error:
Parameter 0 of is of an unknown type 'com.google.musicstore.domain.Account/3669536145'
[java] 16 [btpool0-0] INFO org.hibernate.cfg.Environment - Hibernate 3.3.1.GA
[java] 20 [btpool0-0] INFO org.hibernate.cfg.Environment - hibernate.properties not found
[java] 24 [btpool0-0] INFO org.hibernate.cfg.Environment - Bytecode provider name : javassist
[java] 30 [btpool0-0] INFO org.hibernate.cfg.Environment - using JDK 1.4 java.sql.Timestamp handling
[java] 105 [btpool0-0] INFO org.hibernate.cfg.Configuration - configuring from resource: /hibernate.cfg.xml
[java] 105 [btpool0-0] INFO org.hibernate.cfg.Configuration - Configuration resource: /hibernate.cfg.xml
[java] 192 [btpool0-0] INFO org.hibernate.cfg.Configuration - Reading mappings from resource :com/google/musicstore/domain/Record.hbm.xml
[java] 286 [btpool0-0] INFO org.hibernate.cfg.HbmBinder - Mapping class: com.google.musicstore.domain.Record -> PUBLIC.RECORD
[java] 359 [btpool0-0] INFO org.hibernate.cfg.Configuration - Reading mappings from resource :com/google/musicstore/domain/Account.hbm.xml
[java] 391 [btpool0-0] INFO org.hibernate.cfg.HbmBinder - Mapping class: com.google.musicstore.domain.Account -> PUBLIC.ACCOUNT
[java] 468 [btpool0-0] INFO org.hibernate.cfg.HbmBinder - Mapping collection: com.google.musicstore.domain.Account.records -> PUBLIC.ACCOUNT_RECORD
[java] 490 [btpool0-0] INFO org.hibernate.cfg.Configuration - Configured SessionFactory: null
[java] 569 [btpool0-0] INFO org.hibernate.connection.DriverManagerConnectionProvider - Using Hibernate built-in connection pool (not for production use!)
[java] 569 [btpool0-0] INFO org.hibernate.connection.DriverManagerConnectionProvider - Hibernate connection pool size: 2
[java] 569 [btpool0-0] INFO org.hibernate.connection.DriverManagerConnectionProvider - autocommit mode: false
[java] 577 [btpool0-0] INFO org.hibernate.connection.DriverManagerConnectionProvider - using driver: org.hsqldb.jdbcDriver at URL: jdbc:hsqldb:hsql://localhost/
[java] 578 [btpool0-0] INFO org.hibernate.connection.DriverManagerConnectionProvider - connection properties: {user=sa, password=****}
[java] 866 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - RDBMS: HSQL Database Engine, version: 2.2.8
[java] 866 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - JDBC driver: HSQL Database Engine Driver, version: 2.2.8
[java] 886 [btpool0-0] INFO org.hibernate.dialect.Dialect - Using dialect: org.hibernate.dialect.HSQLDialect
[java] 899 [btpool0-0] INFO org.hibernate.transaction.TransactionFactoryFactory - Using default transaction strategy (direct JDBC transactions)
[java] 903 [btpool0-0] INFO org.hibernate.transaction.TransactionManagerLookupFactory - No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended)
[java] 903 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Automatic flush during beforeCompletion(): disabled
[java] 903 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Automatic session close at endof transaction: disabled
[java] 903 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - JDBC batch size: 15
[java] 903 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - JDBC batch updates for versioned data: disabled
[java] 904 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Scrollable result sets: enabled
[java] 904 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - JDBC3 getGeneratedKeys(): enabled
[java] 905 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Connection release mode: auto
[java] 906 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Default schema: PUBLIC
[java] 906 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Default batch fetch size: 1
[java] 906 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Generate SQL with comments: disabled
[java] 906 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Order SQL updates by primary key: disabled
[java] 906 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Order SQL inserts for batching: disabled
[java] 906 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory
[java] 910 [btpool0-0] INFO org.hibernate.hql.ast.ASTQueryTranslatorFactory - Using ASTQueryTranslatorFactory
[java] 910 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Query language substitutions: {}
[java] 910 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - JPA-QL strict compliance: disabled
[java] 910 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Second-level cache: enabled
[java] 910 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Query cache: disabled
[java] 944 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Cache region factory : org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge
[java] 945 [btpool0-0] INFO org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge -Cache provider: org.hibernate.cache.NoCacheProvider
[java] 945 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Optimize cache for minimal puts: disabled
[java] 945 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Structured second-level cache entries: disabled
[java] 951 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Echoing all SQL to stdout
[java] 952 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Statistics: disabled
[java] 952 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Deleted entity synthetic identifier rollback: disabled
[java] 952 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Default entity-mode: pojo
[java] 952 [btpool0-0] INFO org.hibernate.cfg.SettingsFactory - Named query checking : enabled
[java] 1067 [btpool0-0] INFO org.hibernate.impl.SessionFactoryImpl - building session factory
[java] 1500 [btpool0-0] INFO org.hibernate.impl.SessionFactoryObjectFactory - Not binding factory to JNDI, no JNDI name configured
[java] 1510 [btpool0-0] INFO org.hibernate.tool.hbm2ddl.SchemaExport - Running hbm2ddl schema export
[java] 1511 [btpool0-0] INFO org.hibernate.tool.hbm2ddl.SchemaExport - exporting generated schema to database
[java] GWT15
[java] 1583 [btpool0-0] INFO org.hibernate.tool.hbm2ddl.SchemaExport - schema export complete
Here are the jar files I am using:
adapter4gwt-1.2.3.823.jar
adapter-core-1.2.3.823.jar
antlr-2.7.6.jar
asm-3.2.jar
beanlib-hibernate-3.2.1.jar
cglib-2.2.jar
commons-collections-3.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
dom4j-1.6.1.jar
hibernate-core-3.3.1.GA.jar
hibernate-jpa-2.0-api-1.0.0.Final.jar
hibernate-util-1.2.3.823.jar
hsqldb.jar
javassist-3.4.GA.jar
jboss-serialization.jar
jta-1.1.jar
log4j-1.2.15.jar
slf4j-api-1.5.2.jar
slf4j-simple-1.5.2.jar
trove-2.0.4.jar
Please help me out of this and tell me where I am going wrong.
Thanks in advance.

Well, with "newer" versions of GWT (i.e. newer than 1.7 I think), you'll need to use a more recent version of Gilead. AFAIK, the problem you describe has been solved long ago. The last release was version 1.3.2 on 2010-05-22.
Yes, this release is old, and Gilead isn't updated anymore. However, it still works, even with the latest GWT versions (currently 2.5 rc), as long as you're using Hibernate <= 3.5.x.
I wouldn't really recommend starting a project with Gilead now, see http://sourceforge.net/projects/gilead/forums/forum/868076/topic/4525959
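For reference, here is a minimal sketch of the Gilead wiring the linked article describes (Gilead 1.3.x class names; the service class and factory-holder names are placeholders). If the RPC servlet doesn't extend PersistentRemoteService with a configured PersistentBeanManager, or the domain classes sent over RPC (Account, Record) don't extend Gilead's LightEntity, GWT-RPC cannot serialize the Hibernate entities, which typically produces an "unknown type" error like the one above:

import net.sf.gilead.core.PersistentBeanManager;
import net.sf.gilead.core.hibernate.HibernateUtil;
import net.sf.gilead.core.store.stateless.StatelessProxyStore;
import net.sf.gilead.gwt.PersistentRemoteService;

public class MusicStoreServiceImpl extends PersistentRemoteService {
    public MusicStoreServiceImpl() {
        // Wrap your own Hibernate SessionFactory for Gilead.
        HibernateUtil gileadUtil = new HibernateUtil();
        gileadUtil.setSessionFactory(MySessionFactoryHolder.getSessionFactory()); // placeholder: your own factory holder

        // The bean manager clones/merges Hibernate entities across the GWT-RPC boundary.
        PersistentBeanManager beanManager = new PersistentBeanManager();
        beanManager.setPersistenceUtil(gileadUtil);
        beanManager.setProxyStore(new StatelessProxyStore());
        setBeanManager(beanManager);
    }
}

Also check that the adapter4gwt/adapter-core jar versions match your GWT version; a mismatch between those jars and the GWT RPC machinery is the usual cause of this error, which is what the answer above is pointing at.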

Related

No suitable driver found for jdbc:pstgresql://localhost:5432/hibernatedb

I am getting this error. I have tried putting the two jar files in the src folder, the project folder, etc. Please check and help.
I tried both Hibernate 3.6.4 and 5.4.2, in different projects.
294 [main] INFO org.hibernate.cfg.Environment - Hibernate 3.6.4.Final
297 [main] INFO org.hibernate.cfg.Environment - hibernate.properties not found
303 [main] INFO org.hibernate.cfg.Environment - Bytecode provider name : javassist
311 [main] INFO org.hibernate.cfg.Environment - using JDK 1.4 java.sql.Timestamp handling
446 [main] INFO org.hibernate.cfg.Configuration - configuring from resource: /hibernate.cfg.xml
446 [main] INFO org.hibernate.cfg.Configuration - Configuration resource: /hibernate.cfg.xml
1330 [main] INFO org.hibernate.cfg.Configuration - Configured SessionFactory: null
1476 [main] INFO org.hibernate.cfg.AnnotationBinder - Binding entity from annotated class: org.SecondHibernateProject.dto.UserDetails
1550 [main] INFO org.hibernate.cfg.annotations.EntityBinder - Bind entity org.SecondHibernateProject.dto.UserDetails on table UserDetails
1629 [main] INFO org.hibernate.cfg.Configuration - Hibernate Validator not found: ignoring
1635 [main] INFO org.hibernate.cfg.search.HibernateSearchEventListenerRegister - Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled.
1647 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - Using Hibernate built-in connection pool (not for production use!)
1647 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - Hibernate connection pool size: 1
1647 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - autocommit mode: false
1678 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - using driver: org.postgresql.Driver at URL: jdbc:pstgresql://localhost:5432/hibernatedb
1678 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - connection properties: {user=admin, password=****}
1680 [main] WARN org.hibernate.cfg.SettingsFactory - Could not obtain connection to query metadata
java.sql.SQLException: No suitable driver found for jdbc:pstgresql://localhost:5432/hibernatedb
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.hibernate.connection.DriverManagerConnectionProvider.getConnection(DriverManagerConnectionProvider.java:133)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:113)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2863)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2859)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1870)
at org.vishal.hibernate.HibernateTest2.main(HibernateTest2.java:12)
1742 [main] INFO org.hibernate.dialect.Dialect - Using dialect: org.hibernate.dialect.PostgreSQLDialect
1760 [main] INFO org.hibernate.engine.jdbc.JdbcSupportLoader - Disabling contextual LOB creation as connection was null
1762 [main] INFO org.hibernate.transaction.TransactionFactoryFactory - Using default transaction strategy (direct JDBC transactions)
1764 [main] INFO org.hibernate.transaction.TransactionManagerLookupFactory - No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended)
1764 [main] INFO org.hibernate.cfg.SettingsFactory - Automatic flush during beforeCompletion(): disabled
1764 [main] INFO org.hibernate.cfg.SettingsFactory - Automatic session close at end of transaction: disabled
1765 [main] INFO org.hibernate.cfg.SettingsFactory - Scrollable result sets: disabled
1765 [main] INFO org.hibernate.cfg.SettingsFactory - JDBC3 getGeneratedKeys(): disabled
1765 [main] INFO org.hibernate.cfg.SettingsFactory - Connection release mode: auto
1766 [main] INFO org.hibernate.cfg.SettingsFactory - Default batch fetch size: 1
1767 [main] INFO org.hibernate.cfg.SettingsFactory - Generate SQL with comments: disabled
1767 [main] INFO org.hibernate.cfg.SettingsFactory - Order SQL updates by primary key: disabled
1767 [main] INFO org.hibernate.cfg.SettingsFactory - Order SQL inserts for batching: disabled
1767 [main] INFO org.hibernate.cfg.SettingsFactory - Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory
1770 [main] INFO org.hibernate.hql.ast.ASTQueryTranslatorFactory - Using ASTQueryTranslatorFactory
1771 [main] INFO org.hibernate.cfg.SettingsFactory - Query language substitutions: {}
1771 [main] INFO org.hibernate.cfg.SettingsFactory - JPA-QL strict compliance: disabled
1772 [main] INFO org.hibernate.cfg.SettingsFactory - Second-level cache: enabled
1772 [main] INFO org.hibernate.cfg.SettingsFactory - Query cache: disabled
1773 [main] INFO org.hibernate.cfg.SettingsFactory - Cache region factory : org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge
1777 [main] INFO org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge - Cache provider: org.hibernate.cache.NoCacheProvider
1778 [main] INFO org.hibernate.cfg.SettingsFactory - Optimize cache for minimal puts: disabled
1778 [main] INFO org.hibernate.cfg.SettingsFactory - Structured second-level cache entries: disabled
1784 [main] INFO org.hibernate.cfg.SettingsFactory - Echoing all SQL to stdout
1785 [main] INFO org.hibernate.cfg.SettingsFactory - Statistics: disabled
1785 [main] INFO org.hibernate.cfg.SettingsFactory - Deleted entity synthetic identifier rollback: disabled
1786 [main] INFO org.hibernate.cfg.SettingsFactory - Default entity-mode: pojo
1786 [main] INFO org.hibernate.cfg.SettingsFactory - Named query checking : enabled
1786 [main] INFO org.hibernate.cfg.SettingsFactory - Check Nullability in Core (should be disabled when Bean Validation is on): enabled
1868 [main] INFO org.hibernate.impl.SessionFactoryImpl - building session factory
1882 [main] INFO org.hibernate.type.BasicTypeRegistry - Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType#35fc6dc4
2572 [main] INFO org.hibernate.impl.SessionFactoryObjectFactory - Not binding factory to JNDI, no JNDI name configured
2584 [main] INFO org.hibernate.tool.hbm2ddl.SchemaExport - Running hbm2ddl schema export
2585 [main] INFO org.hibernate.tool.hbm2ddl.SchemaExport - exporting generated schema to database
2585 [main] ERROR org.hibernate.tool.hbm2ddl.SchemaExport - schema export unsuccessful
java.sql.SQLException: No suitable driver found for jdbc:pstgresql://localhost:5432/hibernatedb
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.hibernate.connection.DriverManagerConnectionProvider.getConnection(DriverManagerConnectionProvider.java:133)
at org.hibernate.tool.hbm2ddl.SuppliedConnectionProviderConnectionHelper.prepare(SuppliedConnectionProviderConnectionHelper.java:51)
at org.hibernate.tool.hbm2ddl.SchemaExport.execute(SchemaExport.java:263)
at org.hibernate.tool.hbm2ddl.SchemaExport.create(SchemaExport.java:219)
at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:372)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1872)
at org.vishal.hibernate.HibernateTest2.main(HibernateTest2.java:12)
2659 [main] WARN org.hibernate.util.JDBCExceptionReporter - SQL Error: 0, SQLState: 08001
2659 [main] ERROR org.hibernate.util.JDBCExceptionReporter - No suitable driver found for jdbc:pstgresql://localhost:5432/hibernatedb
Exception in thread "main" org.hibernate.exception.JDBCConnectionException: Cannot open connection
at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:99)
at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:66)
at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:52)
at org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:449)
at org.hibernate.jdbc.ConnectionManager.getConnection(ConnectionManager.java:167)
at org.hibernate.jdbc.JDBCContext.connection(JDBCContext.java:160)
at org.hibernate.transaction.JDBCTransaction.begin(JDBCTransaction.java:81)
at org.hibernate.impl.SessionImpl.beginTransaction(SessionImpl.java:1473)
at org.vishal.hibernate.HibernateTest2.main(HibernateTest2.java:14)
Caused by: java.sql.SQLException: No suitable driver found for jdbc:pstgresql://localhost:5432/hibernatedb
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.hibernate.connection.DriverManagerConnectionProvider.getConnection(DriverManagerConnectionProvider.java:133)
at org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:446)
... 5 more
My Hibernate configuration file looks like this:
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
    "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
    "http://hibernate.org/dtd/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
  <session-factory>
    <!-- Database connection settings -->
    <property name="connection.driver_class">org.postgresql.Driver</property>
    <property name="connection.url">jdbc:pstgresql://localhost:5432/hibernatedb</property>
    <property name="connection.username">admin</property>
    <property name="connection.password">admin</property>
    <!-- JDBC connection pool (use the built-in one) -->
    <property name="connection.pool_size">1</property>
    <!-- SQL dialect -->
    <property name="dialect">org.hibernate.dialect.PostgreSQLDialect</property>
    <!-- Disable the second-level cache -->
    <property name="cache.provider_class">org.hibernate.cache.NoCacheProvider</property>
    <!-- Echo all executed SQL to stdout -->
    <property name="show_sql">true</property>
    <!-- Drop and re-create the database schema on startup -->
    <property name="hbm2ddl.auto">create</property>
    <!-- Names the annotated entity class -->
    <mapping class="org.SecondHibernateProject.dto.UserDetails" />
  </session-factory>
</hibernate-configuration>
And the main Java class is as below:
package org.vishal.hibernate;

import org.SecondHibernateProject.dto.UserDetails;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class HibernateTest2 {
    public static void main(String[] args) {
        UserDetails user = new UserDetails();
        user.setUserId(1);
        user.setUserName("First User");
        SessionFactory sessionFactory = new Configuration().configure().buildSessionFactory();
        Session session = sessionFactory.openSession();
        session.beginTransaction();
        session.save(user);
        session.getTransaction().commit();
    }
}
Please follow these steps to fix the issue:
1. Download the PostgreSQL JDBC jar matching your version from https://jdbc.postgresql.org/download.html.
2. Add the jar to the project with the "external JARs" option in Eclipse.
3. Verify the classpath and the PostgreSQL connection URL. In your case the URL is misspelled: it says jdbc:pstgresql where it should say jdbc:postgresql, which is exactly why no suitable driver is found.
4. Make sure hibernate.cfg.xml is placed at the root of the src folder if you read it with:
SessionFactory sessionFactory = new Configuration().configure().buildSessionFactory();
If hibernate.cfg.xml is located elsewhere, read it from that path instead:
SessionFactory sessionFactory = new Configuration().configure("/path/to/hibernate.cfg.xml").buildSessionFactory();
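As a quick sanity check, you can take Hibernate out of the picture entirely and verify the driver and URL with plain JDBC. A minimal sketch, using the credentials from the configuration above and the corrected "postgresql" scheme:

import java.sql.Connection;
import java.sql.DriverManager;

public class PgConnectionCheck {
    public static void main(String[] args) throws Exception {
        // "postgresql", not "pstgresql": the misspelled scheme is why no registered driver matches.
        String url = "jdbc:postgresql://localhost:5432/hibernatedb";
        try (Connection c = DriverManager.getConnection(url, "admin", "admin")) {
            System.out.println("Connected to: " + c.getMetaData().getDatabaseProductName());
        }
    }
}

If this prints "Connected to: PostgreSQL", fix connection.url in hibernate.cfg.xml the same way and the schema export should succeed.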

ERROR [main] (00000021-ImpEx-Import) [ImpExImportJob] Can not resolve any more lines ... Aborting further passes

[java] INFO [main] [ImpExSystemSetup] importing resource : /impex/essentialdata_mediaconversion_jobs.impex
[java] INFO [main] (00000021-ImpEx-Import) [ImpExImportJob] Starting multi-threaded ImpEx cronjob "ImpEx-Import" (8 threads)
[java] INFO [ImpExWorker<2/8>] [ImpExWorker] Returning worker impex worker 2/8 [cj:00000021-ImpEx-Import] to the pool
[java] INFO [ImpExWorker<3/8>] [ImpExWorker] Returning worker impex worker 3/8 [cj:00000021-ImpEx-Import] to the pool
[java] INFO [ImpExWorker<4/8>] [ImpExWorker] Returning worker impex worker 4/8 [cj:00000021-ImpEx-Import] to the pool
[java] INFO [ImpExWorker<5/8>] [ImpExWorker] Returning worker impex worker 5/8 [cj:00000021-ImpEx-Import] to the pool
[java] INFO [ImpExWorker<6/8>] [ImpExWorker] Returning worker impex worker 6/8 [cj:00000021-ImpEx-Import] to the pool
[java] INFO [ImpExWorker<7/8>] [ImpExWorker] Returning worker impex worker 7/8 [cj:00000021-ImpEx-Import] to the pool
[java] INFO [ImpExWorker<0/8>] [ImpExWorker] Returning worker impex worker 0/8 [cj:00000021-ImpEx-Import] to the pool
[java] INFO [ImpExWorker<1/8>] [ImpExWorker] Returning worker impex worker 1/8 [cj:00000021-ImpEx-Import] to the pool
[java] INFO [impex result worker [cj:00000021-ImpEx-Import]] [ImpExWorker] Returning worker impex result worker [cj:00000021-ImpEx-Import] to the pool
[java] INFO [main] (00000021-ImpEx-Import) [Importer] Finished 1 pass in 0d 00h:00m:00s:027ms - processed: 4, dumped: 1 (last pass: 0)
[java] INFO [impex reader worker [cj:00000021-ImpEx-Import]] [ImpExWorker] Returning worker impex reader worker [cj:00000021-ImpEx-Import] to the pool
[java] INFO [main] (00000021-ImpEx-Import) [Importer] Starting pass 2
[java] INFO [ImpExWorker<0/8>] [ImpExWorker] Returning worker impex worker 0/8 [cj:00000021-ImpEx-Import] to the pool
[java] WARN [impex result worker [cj:00000021-ImpEx-Import]] [ImpExImportReader] line 2 at main script: dumped unresolved line ValueLine[unresolvable:null,line 2 at main script,,null,{1=ValueEntry('# no current header for value line'=null,unresolved=null,ignore=false)}]
[java] INFO [impex result worker [cj:00000021-ImpEx-Import]] [ImpExWorker] Returning worker impex result worker [cj:00000021-ImpEx-Import] to the pool
[java] INFO [main] (00000021-ImpEx-Import) [Importer] Finished 2 pass in 0d 00h:00m:00s:001ms - processed: 1, dumped: 1 (last pass: 1)
[java] INFO [impex reader worker [cj:00000021-ImpEx-Import]] [ImpExWorker] Returning worker impex reader worker [cj:00000021-ImpEx-Import] to the pool
[java] WARN [main] (00000021-ImpEx-Import) [Importer] Import aborted after 0d 00h:00m:00s:068ms
[java] ERROR [main] (00000021-ImpEx-Import) [ImpExImportJob] Can not resolve any more lines ... Aborting further passes (at pass 2). Finally could not import 1 lines!
[java] ERROR [main] [ImpExManager] Import has caused an error, see logs of cronjob with code=00000021-ImpEx-Import for further details
Pay attention to the warning:
[ImpExImportReader] line 2 at main script: dumped unresolved line ValueLine[unresolvable:null,line 2 at main script,,null,{1=ValueEntry('# no current header for value line'=null,unresolved=null,ignore=false)}]
Your ImpEx header is not well-formed: a value line was encountered with no valid header line in effect.
If you want a concrete solution, please post the original ImpEx text.
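For illustration, a minimal well-formed ImpEx block looks like this (a generic example, not your data). Every value line, which starts with ';', must be preceded by a header line naming the import mode, the item type, and the columns; a value line with no header in effect is dumped with exactly the "no current header for value line" warning above:

# header: import mode, item type, and columns
INSERT_UPDATE Title;code[unique=true];name[lang=en]
# value lines follow the header and start with ';'
;mr;"Mr."
;mrs;"Mrs."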

Storing HDFS data into MongoDB using Pig

I am new to Hadoop and need to store Hadoop data into MongoDB; I am using Pig to do this.
I downloaded the following jars and registered them in the Pig Grunt shell with these commands:
REGISTER /home/miracle/Downloads/mongo-hadoop-pig-2.0.2.jar
REGISTER /home/miracle/Downloads/mongo-java-driver-3.4.2.jar
REGISTER /home/miracle/Downloads/mongo-hadoop-core-2.0.2.jar
After this I successfully read data from MongoDB using the following command:
raw = LOAD 'mongodb://localhost:27017/pavan.pavan.in' USING com.mongodb.hadoop.pig.MongoLoader;
Then I tried the following command to insert the data from a Pig bag into MongoDB, and it succeeded:
STORE student INTO 'mongodb://localhost:27017/pavan.persons_info' USING com.mongodb.hadoop.pig.MongoInsertStorage('','');
Then I tried a Mongo update with the command below:
STORE student INTO 'mongodb://localhost:27017/pavan.persons_info1' USING com.mongodb.hadoop.pig.MongoUpdateStorage(' ','{first:"\$firstname", last:"\$lastname", phone:"\$phone", city:"\$city"}','firstname: chararray,lastname: chararray,phone: chararray,city: chararray');
But I get the following error when running that command:
2017-03-22 11:16:42,516 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-03-22 11:16:43,064 [main] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: ; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:43,180 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2017-03-22 11:16:43,306 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-03-22 11:16:43,308 [main] WARN org.apache.pig.data.SchemaTupleBackend - SchemaTupleBackend has already been initialized
2017-03-22 11:16:43,309 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, ConstantCalculator, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PredicatePushdownOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter]}
2017-03-22 11:16:43,310 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2017-03-22 11:16:43,314 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2017-03-22 11:16:43,314 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2017-03-22 11:16:43,415 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-03-22 11:16:43,419 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-03-22 11:16:43,423 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2017-03-22 11:16:43,425 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2017-03-22 11:16:43,438 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - This job cannot be converted run in-process
2017-03-22 11:16:43,603 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/home/miracle/Downloads/mongo-java-driver-3.0.4.jar to DistributedCache through /tmp/temp159471787/tmp643027494/mongo-java-driver-3.0.4.jar
2017-03-22 11:16:43,687 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/home/miracle/Downloads/mongo-hadoop-core-2.0.2.jar to DistributedCache through /tmp/temp159471787/tmp-1745369112/mongo-hadoop-core-2.0.2.jar
2017-03-22 11:16:43,822 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/home/miracle/Downloads/mongo-hadoop-pig-2.0.2.jar to DistributedCache through /tmp/temp159471787/tmp116725398/mongo-hadoop-pig-2.0.2.jar
2017-03-22 11:16:44,693 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/usr/local/pig/pig-0.16.0/pig-0.16.0-core-h2.jar to DistributedCache through /tmp/temp159471787/tmp499355324/pig-0.16.0-core-h2.jar
2017-03-22 11:16:44,762 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/usr/local/pig/pig-0.16.0/lib/automaton-1.11-8.jar to DistributedCache through /tmp/temp159471787/tmp413788756/automaton-1.11-8.jar
2017-03-22 11:16:44,830 [DataStreamer for file /tmp/temp159471787/tmp-380031198/antlr-runtime-3.4.jar block BP-1303579226-127.0.1.1-1489750707340:blk_1073742392_1568] WARN org.apache.hadoop.hdfs.DFSClient - Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1249)
at java.lang.Thread.join(Thread.java:1323)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2017-03-22 11:16:44,856 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/usr/local/pig/pig-0.16.0/lib/antlr-runtime-3.4.jar to DistributedCache through /tmp/temp159471787/tmp-380031198/antlr-runtime-3.4.jar
2017-03-22 11:16:44,960 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/usr/local/pig/pig-0.16.0/lib/joda-time-2.9.3.jar to DistributedCache through /tmp/temp159471787/tmp1163422388/joda-time-2.9.3.jar
2017-03-22 11:16:44,996 [main] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: ; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:45,004 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2017-03-22 11:16:45,147 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2017-03-22 11:16:45,166 [JobControl] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-03-22 11:16:45,253 [JobControl] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: ; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:45,318 [JobControl] WARN org.apache.hadoop.mapreduce.JobResourceUploader - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
2017-03-22 11:16:45,572 [JobControl] INFO org.apache.pig.builtin.PigStorage - Using PigTextInputFormat
2017-03-22 11:16:45,579 [JobControl] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2017-03-22 11:16:45,581 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
2017-03-22 11:16:45,593 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2017-03-22 11:16:45,690 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
2017-03-22 11:16:45,884 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_local1070093788_0006
2017-03-22 11:16:47,476 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: /tmp/hadoop-miracle/mapred/local/1490206606120/mongo-java-driver-3.0.4.jar <- /home/miracle/mongo-java-driver-3.0.4.jar
2017-03-22 11:16:47,534 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized hdfs://localhost:9000/tmp/temp159471787/tmp643027494/mongo-java-driver-3.0.4.jar as file:/tmp/hadoop-miracle/mapred/local/1490206606120/mongo-java-driver-3.0.4.jar
2017-03-22 11:16:47,534 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: /tmp/hadoop-miracle/mapred/local/1490206606121/mongo-hadoop-core-2.0.2.jar <- /home/miracle/mongo-hadoop-core-2.0.2.jar
2017-03-22 11:16:47,674 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized hdfs://localhost:9000/tmp/temp159471787/tmp-1745369112/mongo-hadoop-core-2.0.2.jar as file:/tmp/hadoop-miracle/mapred/local/1490206606121/mongo-hadoop-core-2.0.2.jar
2017-03-22 11:16:48,194 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: /tmp/hadoop-miracle/mapred/local/1490206606122/mongo-hadoop-pig-2.0.2.jar <- /home/miracle/mongo-hadoop-pig-2.0.2.jar
2017-03-22 11:16:48,201 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized hdfs://localhost:9000/tmp/temp159471787/tmp116725398/mongo-hadoop-pig-2.0.2.jar as file:/tmp/hadoop-miracle/mapred/local/1490206606122/mongo-hadoop-pig-2.0.2.jar
2017-03-22 11:16:48,329 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: /tmp/hadoop-miracle/mapred/local/1490206606123/pig-0.16.0-core-h2.jar <- /home/miracle/pig-0.16.0-core-h2.jar
2017-03-22 11:16:48,337 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized hdfs://localhost:9000/tmp/temp159471787/tmp499355324/pig-0.16.0-core-h2.jar as file:/tmp/hadoop-miracle/mapred/local/1490206606123/pig-0.16.0-core-h2.jar
2017-03-22 11:16:48,338 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: /tmp/hadoop-miracle/mapred/local/1490206606124/automaton-1.11-8.jar <- /home/miracle/automaton-1.11-8.jar
2017-03-22 11:16:48,370 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized hdfs://localhost:9000/tmp/temp159471787/tmp413788756/automaton-1.11-8.jar as file:/tmp/hadoop-miracle/mapred/local/1490206606124/automaton-1.11-8.jar
2017-03-22 11:16:48,371 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: /tmp/hadoop-miracle/mapred/local/1490206606125/antlr-runtime-3.4.jar <- /home/miracle/antlr-runtime-3.4.jar
2017-03-22 11:16:48,384 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized hdfs://localhost:9000/tmp/temp159471787/tmp-380031198/antlr-runtime-3.4.jar as file:/tmp/hadoop-miracle/mapred/local/1490206606125/antlr-runtime-3.4.jar
2017-03-22 11:16:48,389 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: /tmp/hadoop-miracle/mapred/local/1490206606126/joda-time-2.9.3.jar <- /home/miracle/joda-time-2.9.3.jar
2017-03-22 11:16:48,409 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized hdfs://localhost:9000/tmp/temp159471787/tmp1163422388/joda-time-2.9.3.jar as file:/tmp/hadoop-miracle/mapred/local/1490206606126/joda-time-2.9.3.jar
2017-03-22 11:16:48,798 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/tmp/hadoop-miracle/mapred/local/1490206606120/mongo-java-driver-3.0.4.jar
2017-03-22 11:16:48,803 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/tmp/hadoop-miracle/mapred/local/1490206606121/mongo-hadoop-core-2.0.2.jar
2017-03-22 11:16:48,803 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/tmp/hadoop-miracle/mapred/local/1490206606122/mongo-hadoop-pig-2.0.2.jar
2017-03-22 11:16:48,804 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/tmp/hadoop-miracle/mapred/local/1490206606123/pig-0.16.0-core-h2.jar
2017-03-22 11:16:48,806 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/tmp/hadoop-miracle/mapred/local/1490206606124/automaton-1.11-8.jar
2017-03-22 11:16:48,807 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/tmp/hadoop-miracle/mapred/local/1490206606125/antlr-runtime-3.4.jar
2017-03-22 11:16:48,807 [JobControl] INFO org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/tmp/hadoop-miracle/mapred/local/1490206606126/joda-time-2.9.3.jar
2017-03-22 11:16:48,807 [JobControl] INFO org.apache.hadoop.mapreduce.Job - The url to track the job: http://localhost:8080/
2017-03-22 11:16:48,809 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_local1070093788_0006
2017-03-22 11:16:48,812 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Processing aliases student1
2017-03-22 11:16:48,812 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - detailed locations: M: student1[7,11] C: R:
2017-03-22 11:16:48,889 [Thread-455] INFO org.apache.hadoop.mapred.LocalJobRunner - OutputCommitter set in config null
2017-03-22 11:16:48,915 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2017-03-22 11:16:48,915 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Running jobs are [job_local1070093788_0006]
2017-03-22 11:16:48,999 [Thread-455] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: ; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:49,011 [Thread-455] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2017-03-22 11:16:49,013 [Thread-455] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent
2017-03-22 11:16:49,013 [Thread-455] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-03-22 11:16:49,054 [Thread-455] INFO org.apache.hadoop.mapred.LocalJobRunner - OutputCommitter is org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputCommitter
2017-03-22 11:16:49,094 [Thread-455] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml, file:/tmp/hadoop-miracle/mapred/local/localRunner/miracle/job_local1070093788_0006/job_local1070093788_0006.xml; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:49,104 [Thread-455] INFO com.mongodb.hadoop.output.MongoOutputCommitter - Setting up job.
2017-03-22 11:16:49,126 [Thread-455] INFO org.apache.hadoop.mapred.LocalJobRunner - Waiting for map tasks
2017-03-22 11:16:49,127 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local1070093788_0006_m_000000_0
2017-03-22 11:16:49,253 [LocalJobRunner Map Task Executor #0] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml, file:/tmp/hadoop-miracle/mapred/local/localRunner/miracle/job_local1070093788_0006/job_local1070093788_0006.xml; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:49,279 [LocalJobRunner Map Task Executor #0] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml, file:/tmp/hadoop-miracle/mapred/local/localRunner/miracle/job_local1070093788_0006/job_local1070093788_0006.xml; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:49,290 [LocalJobRunner Map Task Executor #0] INFO com.mongodb.hadoop.output.MongoOutputCommitter - Setting up task.
2017-03-22 11:16:49,296 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : [ ]
2017-03-22 11:16:49,340 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.MapTask - Processing split: Number of splits :1
Total Length = 212
Input split[0]:
Length = 212
ClassName: org.apache.hadoop.mapreduce.lib.input.FileSplit
Locations:
-----------------------
2017-03-22 11:16:49,415 [LocalJobRunner Map Task Executor #0] INFO org.apache.pig.builtin.PigStorage - Using PigTextInputFormat
2017-03-22 11:16:49,417 [LocalJobRunner Map Task Executor #0] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader - Current split being processed hdfs://localhost:9000/input/student_dir/student_Info.txt:0+212
2017-03-22 11:16:49,459 [LocalJobRunner Map Task Executor #0] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml, file:/tmp/hadoop-miracle/mapred/local/localRunner/miracle/job_local1070093788_0006/job_local1070093788_0006.xml; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:49,684 [LocalJobRunner Map Task Executor #0] INFO org.mongodb.driver.cluster - Cluster created with settings {hosts=[localhost:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}
2017-03-22 11:16:50,484 [LocalJobRunner Map Task Executor #0] INFO com.mongodb.hadoop.output.MongoRecordWriter - Writing to temporary file: /tmp/hadoop-miracle/attempt_local1070093788_0006_m_000000_0/_MONGO_OUT_TEMP/_out
2017-03-22 11:16:50,516 [LocalJobRunner Map Task Executor #0] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Preparing to write to com.mongodb.hadoop.output.MongoRecordWriter#1fd6ae6
2017-03-22 11:16:50,736 [LocalJobRunner Map Task Executor #0] INFO org.apache.pig.impl.util.SpillableMemoryManager - Selected heap (Tenured Gen) of size 699072512 to monitor. collectionUsageThreshold = 489350752, usageThreshold = 489350752
2017-03-22 11:16:50,739 [LocalJobRunner Map Task Executor #0] WARN org.apache.pig.data.SchemaTupleBackend - SchemaTupleBackend has already been initialized
2017-03-22 11:16:50,746 [LocalJobRunner Map Task Executor #0] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map - Aliases being processed per job phase (AliasName[line,offset]): M: student1[7,11] C: R:
2017-03-22 11:16:50,880 [Thread-455] INFO org.apache.hadoop.mapred.LocalJobRunner - map task executor complete.
2017-03-22 11:16:50,919 [Thread-455] INFO com.mongodb.hadoop.pig.MongoUpdateStorage - Store location config: Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml, file:/tmp/hadoop-miracle/mapred/local/localRunner/miracle/job_local1070093788_0006/job_local1070093788_0006.xml; for namespace: pavan.persons_info1; hosts: [localhost:27017]
2017-03-22 11:16:50,963 [Thread-455] WARN org.apache.hadoop.mapred.LocalJobRunner - job_local1070093788_0006
java.lang.Exception: java.io.IOException: java.io.IOException: Couldn't convert tuple to bson:
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: java.io.IOException: java.io.IOException: Couldn't convert tuple to bson:
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.StoreFuncDecorator.putNext(StoreFuncDecorator.java:83)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:144)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:97)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map.collect(PigMapOnly.java:48)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:261)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:65)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Couldn't convert tuple to bson:
at com.mongodb.hadoop.pig.MongoUpdateStorage.putNext(MongoUpdateStorage.java:165)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.StoreFuncDecorator.putNext(StoreFuncDecorator.java:75)
... 17 more
Caused by: java.lang.IndexOutOfBoundsException: Index: 1, Size: 1
at java.util.ArrayList.rangeCheck(ArrayList.java:653)
at java.util.ArrayList.get(ArrayList.java:429)
at org.apache.pig.data.DefaultTuple.get(DefaultTuple.java:117)
at com.mongodb.hadoop.pig.JSONPigReplace.substitute(JSONPigReplace.java:120)
at com.mongodb.hadoop.pig.MongoUpdateStorage.putNext(MongoUpdateStorage.java:142)
... 18 more
2017-03-22 11:16:53,944 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2017-03-22 11:16:53,944 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_local1070093788_0006 has failed! Stop running all dependent jobs
2017-03-22 11:16:53,945 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2017-03-22 11:16:53,949 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-03-22 11:16:53,954 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-03-22 11:16:53,962 [main] ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map reduce job(s) failed!
2017-03-22 11:16:53,981 [main] INFO org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.7.3 0.16.0 miracle 2017-03-22 11:16:43 2017-03-22 11:16:53 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_local1070093788_0006 student1 MAP_ONLY Message: Job failed! mongodb://localhost:27017/pavan.persons_info1,
Input(s):
Failed to read data from "hdfs://localhost:9000/input/student_dir/student_Info.txt"
Output(s):
Failed to produce result in "mongodb://localhost:27017/pavan.persons_info1"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_local1070093788_0006
2017-03-22 11:16:53,983 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2017-03-22 11:16:54,004 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1002: Unable to store alias student1
Details at logfile: /home/miracle/pig_1490205716326.log
Here is the input that I dumped:
Input(s):
Successfully read 6 records (5378419 bytes) from: "hdfs://localhost:9000/input/student_dir/student_Info.txt"
Output(s):
Successfully stored 6 records (5378449 bytes) in: "hdfs://localhost:9000/tmp/temp-1419179625/tmp882976412"
Counters:
Total records written : 6
Total bytes written : 5378449
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_local1866034015_0001
2017-03-23 02:43:37,677 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-03-23 02:43:37,681 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-03-23 02:43:37,689 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-03-23 02:43:37,736 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Success!
2017-03-23 02:43:37,748 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-03-23 02:43:37,751 [main] WARN org.apache.pig.data.SchemaTupleBackend - SchemaTupleBackend has already been initialized
2017-03-23 02:43:37,793 [main] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2017-03-23 02:43:37,793 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
(Rajiv,Reddy,9848022337,Hyderabad)
(siddarth,Battacharya,9848022338,Kolkata)
(Rajesh,Khanna,9848022339,Delhi)
(Preethi,Agarwal,9848022330,Pune)
(Trupthi,Mohanthy,9848022336,Bhuwaneshwar)
(Archana,Mishra,9848022335,Chennai.)
I don't know what to do next; please give me any suggestions.
I do not use Mongo, but from HBase experience and from the error, it looks like a few fields are not FLATTENed enough to fit the column family, and a few of the columns you are trying to STORE don't exist.
Try DUMPing the data instead of STOREing it, and check whether it matches the STORE structure you have applied.
Caused by: java.io.IOException: Couldn't convert tuple to bson:
at com.mongodb.hadoop.pig.MongoUpdateStorage.putNext(MongoUpdateStorage.java:165)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.StoreFuncDecorator.putNext(StoreFuncDecorator.java:75)
... 17 more
Caused by: java.lang.IndexOutOfBoundsException: Index: 1, Size: 1
at java.util.ArrayList.rangeCheck(ArrayList.java:653)
at java.util.ArrayList.get(ArrayList.java:429)
at org.apache.pig.data.DefaultTuple.get(DefaultTuple.java:117)
at com.mongodb.hadoop.pig.JSONPigReplace.substitute(JSONPigReplace.java:120)
at com.mongodb.hadoop.pig.MongoUpdateStorage.putNext(MongoUpdateStorage.java:142)
... 18 more
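To expand on that: the IndexOutOfBoundsException (Index: 1, Size: 1) says each tuple reaching MongoUpdateStorage holds a single field, so $lastname and the later placeholders have nothing to bind to. A common cause is loading the file with PigStorage's default tab delimiter, so each whole line becomes one field. A sketch of what to check, assuming the input file is comma-separated (adjust the delimiter to match your file):

-- Load with an explicit delimiter and a four-field schema so each tuple
-- actually carries the fields the update template references.
student = LOAD 'hdfs://localhost:9000/input/student_dir/student_Info.txt'
          USING PigStorage(',')
          AS (firstname:chararray, lastname:chararray, phone:chararray, city:chararray);

-- Confirm four fields per tuple before trying the STORE again.
DESCRIBE student;
DUMP student;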

Eclipse / Sonar Analysis - java.lang.OutOfMemoryError: PermGen space

Good morning,
I am trying to synchronize my Eclipse project with the Sonar server we have in our company. I tried on two completely different computers/environments but cannot find a solution, even though I applied every suggested tip found here (Stack Overflow) or elsewhere.
It seems (as the stack trace clearly shows) that I am running out of memory. Since the project is really huge that may be expected, but perhaps I simply don't know how to launch Eclipse with more memory.
I modified eclipse.ini as below:
-startup
plugins/org.eclipse.equinox.launcher_1.3.0.v20140415-2008.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.win32.win32.x86_64_1.1.200.v20140603-1326
-product
org.eclipse.epp.package.standard.product
--launcher.defaultAction
openFile
--launcher.XXMaxPermSize
256M
-showsplash
org.eclipse.platform
--launcher.XXMaxPermSize
1024m
--launcher.defaultAction
openFile
--launcher.appendVmargs
-vmargs
-Dosgi.requiredJavaVersion=1.7
-Xms512m
-Xmx2048m
-XX:MaxPermSize=2048m
but nothing changed.
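Note, incidentally, that in the file above --launcher.XXMaxPermSize appears twice (256M, then 1024m) and --launcher.defaultAction appears twice as well. A deduplicated tail of eclipse.ini would look like this sketch (values are examples, not recommendations):

--launcher.XXMaxPermSize
1024m
--launcher.appendVmargs
-vmargs
-Dosgi.requiredJavaVersion=1.7
-Xms512m
-Xmx2048m
-XX:MaxPermSize=1024m

Keep in mind that the -vmargs entries only size Eclipse's own JVM; if the Sonar analysis runs in a separate JVM, that JVM needs its own -XX:MaxPermSize (see the answer after the log below).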
Later I modified Window / Preferences / SonarQube / JVM arguments for analysis with:
-Xmx2048m
and then tried Window / Preferences / SonarQube / Preview Analysis Properties with:
wrapper.java.maxmemory = 6000
but still nothing. Eclipse cannot finish the analysis (maybe it just cannot store the data in the local DB).
These are the environment specs (of the best machine...):
Windows 8 64bit - 8GB RAM
java version "1.7.0"
Java(TM) SE Runtime Environment (build 1.7.0-b147)
Java HotSpot(TM) 64-Bit Server VM (build 21.0-b17, mixed mode)
Eclipse Standard/SDK
Version: Luna Release (4.4.0)
Build id: 20140612-0600
Retrieve remote issues of project PREPRODUZIONE...
Start SonarQube analysis on PREPRODUZIONE...
INFO: SonarQube Server 4.3
11:09:32.970 INFO - Incremental mode
11:09:32.972 INFO - Load batch settings
11:09:33.196 INFO - User cache: C:\Users\alex\.sonar\cache
11:09:33.201 INFO - Install plugins
11:09:33.235 INFO - Include plugins:
11:09:33.235 INFO - Exclude plugins: devcockpit, jira, pdfreport, views, report, buildstability, scmactivity, buildbreaker
11:09:41.588 INFO - Create JDBC datasource for jdbc:h2:C:\ECLIPSE\3.3.2\.metadata\.plugins\org.eclipse.core.resources\.projects\PREPRODUZIONE\org.sonar.ide.eclipse.core\.sonartmp\preview1406279373883-0
11:09:42.716 INFO - Initializing Hibernate
11:09:45.471 INFO - Load project settings
11:09:45.553 INFO - Apply project exclusions
11:09:45.653 INFO - ------------- Scan PREPRODUZIONE
11:09:45.657 INFO - Load module settings
11:09:46.788 INFO - Loading technical debt model...
11:09:46.794 INFO - Loading technical debt model done: 6 ms
11:09:46.794 INFO - Loading rules...
11:09:47.056 INFO - Loading rules done: 262 ms
11:09:47.065 INFO - Configure Maven plugins
11:09:47.127 INFO - Compare to previous analysis (2014-07-10)
11:09:47.133 INFO - Compare over 30 days (2014-06-25, analysis of 2014-07-10 00:13:19.304)
11:09:47.140 INFO - Compare to previous version (2014-07-10)
11:09:47.425 INFO - Loaded quality gate 'RGI Standard - java '
11:09:47.824 INFO - Base dir: C:\ECLIPSE\3.3.2\PREPRODUZIONE
11:09:47.824 INFO - Working dir: C:\ECLIPSE\3.3.2\.metadata\.plugins\org.eclipse.core.resources\.projects\PREPRODUZIONE\org.sonar.ide.eclipse.core
11:09:47.824 INFO - Source dirs: C:\ECLIPSE\3.3.2\PREPRODUZIONE\src\java
11:09:47.824 INFO - Binary dirs: C:\ECLIPSE\3.3.2\PREPRODUZIONE\bin
11:09:47.824 INFO - Source encoding: windows-1252, default locale: it_IT
11:09:47.825 INFO - Index files
11:09:48.879 INFO - 1095 files indexed
11:09:53.579 INFO - Quality profile for java: RGI Standard
11:09:53.579 INFO - Quality profile for xml: Sonar way
11:09:53.898 INFO - JaCoCo report not found.
11:09:53.899 INFO - JaCoCo IT report not found.
11:09:53.899 INFO - JaCoCo reports not found.
11:09:53.915 INFO - Sensor JavaSquidSensor...
11:09:54.088 INFO - Java Main Files AST scan...
11:09:54.091 INFO - 1091 source files to be analyzed
11:10:04.092 INFO - 226/1091 files analyzed, current is C:\ECLIPSE\3.3.2\PREPRODUZIONE\src\java\it\boh\dbo\Vehicle.java
11:10:13.450 INFO - 1091/1091 source files analyzed
11:10:13.587 INFO - Java Main Files AST scan done: 19499 ms
11:10:13.624 INFO - Java bytecode scan...
11:10:14.759 INFO - Java bytecode scan done: 1135 ms
11:10:14.759 INFO - Java Test Files AST scan...
11:10:14.759 INFO - 0 source files to be analyzed
11:10:14.759 INFO - Java Test Files AST scan done: 0 ms
11:10:14.760 INFO - 0/0 source files analyzed
11:10:16.066 INFO - Sensor JavaSquidSensor done: 22151 ms
11:10:16.066 INFO - Sensor QProfileSensor...
11:10:16.067 INFO - Sensor QProfileSensor done: 1 ms
11:10:16.067 INFO - Sensor FindbugsSensor...
11:10:16.068 INFO - Execute Findbugs 2.0.3...
11:10:17.667 INFO - Found findbugs plugin: /C:/Users/alex/.sonar/cache/108e4763f8b1f4a0e9de5d1fa2aba85b/sonar-fb-contrib-plugin-1.2.jar_unzip/META-INF/lib/fb-contrib-4.8.5.jar
11:10:17.827 INFO - Findbugs output report: C:\ECLIPSE\3.3.2\.metadata\.plugins\org.eclipse.core.resources\.projects\PREPRODUZIONE\org.sonar.ide.eclipse.core\findbugs-result.xml
Frame type field is null:false
11:11:20.205 INFO - Execute Findbugs 2.0.3 done: 64137 ms
11:11:20.287 INFO - Sensor FindbugsSensor done: 64220 ms
11:11:20.287 INFO - Sensor CpdSensor...
11:11:20.288 INFO - SonarEngine is used for java
11:11:20.294 INFO - Cross-project analysis disabled
11:11:23.923 INFO - Sensor CpdSensor done: 3636 ms
11:11:23.923 INFO - Sensor PmdSensor...
11:11:23.924 INFO - Execute PMD 5.1.1...
11:11:24.017 INFO - Java version: 1.5
11:11:24.095 INFO - PMD configuration: C:\ECLIPSE\3.3.2\.metadata\.plugins\org.eclipse.core.resources\.projects\PREPRODUZIONE\org.sonar.ide.eclipse.core\pmd.xml
11:11:38.763 INFO - Execute PMD 5.1.1 done: 14839 ms
11:11:38.871 INFO - Sensor PmdSensor done: 14948 ms
11:11:38.871 INFO - Sensor XmlSensor...
11:11:39.069 INFO - Sensor XmlSensor done: 198 ms
11:11:39.070 INFO - Sensor LineCountSensor...
11:11:39.085 INFO - Sensor LineCountSensor done: 15 ms
11:11:39.086 INFO - Sensor SurefireSensor...
11:11:39.087 INFO - parsing C:\ECLIPSE\3.3.2\.metadata\.plugins\org.eclipse.core.resources\.projects\PREPRODUZIONE\org.sonar.ide.eclipse.core\build\surefire-reports
11:11:39.087 WARN - Reports path not found: C:\ECLIPSE\3.3.2\.metadata\.plugins\org.eclipse.core.resources\.projects\PREPRODUZIONE\org.sonar.ide.eclipse.core\build\surefire-reports
11:11:39.087 INFO - Sensor SurefireSensor done: 1 ms
11:11:39.087 INFO - Sensor CheckstyleSensor...
11:11:39.088 INFO - Execute Checkstyle 5.6...
11:11:39.161 INFO - Checkstyle configuration: C:\ECLIPSE\3.3.2\.metadata\.plugins\org.eclipse.core.resources\.projects\PREPRODUZIONE\org.sonar.ide.eclipse.core\checkstyle.xml
11:11:54.484 INFO - Execute Checkstyle 5.6 done: 15396 ms
11:11:54.484 INFO - Sensor CheckstyleSensor done: 15397 ms
11:11:54.484 INFO - Sensor InitialOpenIssuesSensor...
11:11:54.733 INFO - Sensor InitialOpenIssuesSensor done: 249 ms
11:11:54.734 INFO - Sensor ProfileEventsSensor...
11:11:54.745 INFO - Sensor ProfileEventsSensor done: 11 ms
11:11:54.745 INFO - Sensor ProjectLinksSensor...
11:11:54.751 INFO - Sensor ProjectLinksSensor done: 6 ms
11:11:55.040 INFO - Execute decorators...
11:13:35.682 INFO - Export results to C:\ECLIPSE\3.3.2\.metadata\.plugins\org.eclipse.core.resources\.projects\PREPRODUZIONE\org.sonar.ide.eclipse.core\sonar-report.json
11:13:36.629 INFO - Store results in database
Exception in thread "main" org.sonar.runner.impl.RunnerException: Unable to execute Sonar
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.impl.BatchLauncherMain.execute(BatchLauncherMain.java:41)
at org.sonar.runner.impl.BatchLauncherMain.main(BatchLauncherMain.java:59)
Caused by: java.lang.OutOfMemoryError: PermGen space
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at org.sonar.runner.impl.IsolatedClassloader.loadClass(IsolatedClassloader.java:94)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at org.hibernate.hql.ast.SqlASTFactory.class$(SqlASTFactory.java:101)
at org.hibernate.hql.ast.SqlASTFactory.getASTNodeType(SqlASTFactory.java:103)
at antlr.ASTFactory.create(ASTFactory.java:152)
at antlr.ASTFactory.create(ASTFactory.java:186)
at org.hibernate.hql.antlr.HqlSqlBaseWalker.updateStatement(HqlSqlBaseWalker.java:327)
at org.hibernate.hql.antlr.HqlSqlBaseWalker.statement(HqlSqlBaseWalker.java:239)
at org.hibernate.hql.ast.QueryTranslatorImpl.analyze(QueryTranslatorImpl.java:254)
at org.hibernate.hql.ast.QueryTranslatorImpl.doCompile(QueryTranslatorImpl.java:185)
at org.hibernate.hql.ast.QueryTranslatorImpl.compile(QueryTranslatorImpl.java:136)
at org.hibernate.engine.query.HQLQueryPlan.<init>(HQLQueryPlan.java:101)
at org.hibernate.engine.query.HQLQueryPlan.<init>(HQLQueryPlan.java:80)
at org.hibernate.engine.query.QueryPlanCache.getHQLQueryPlan(QueryPlanCache.java:94)
at org.hibernate.impl.AbstractSessionImpl.getHQLQueryPlan(AbstractSessionImpl.java:156)
at org.hibernate.impl.AbstractSessionImpl.createQuery(AbstractSessionImpl.java:135)
at org.hibernate.impl.SessionImpl.createQuery(SessionImpl.java:1651)
at org.hibernate.ejb.AbstractEntityManagerImpl.createQuery(AbstractEntityManagerImpl.java:93)
at org.sonar.jpa.session.JpaDatabaseSession.createQuery(JpaDatabaseSession.java:184)
at org.sonar.batch.phases.UpdateStatusJob.setFlags(UpdateStatusJob.java:133)
at org.sonar.batch.phases.UpdateStatusJob.disablePreviousSnapshot(UpdateStatusJob.java:95)
Any other suggestions?
Thanks
Alessandro
Try going to Window / Preferences / SonarQube and entering your maxPermSize value in the "JVM arguments for preview analysis" box.
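For example (the exact values are illustrative; size them to your project):
-Xmx512m -XX:MaxPermSize=128m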
Set the environment variable below and then try to run the Sonar Runner:
Variable Name: SONAR_RUNNER_OPTS
Variable Value: -Xmx512m -XX:MaxPermSize=128m
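The same setting can also be made in the shell just before launching the runner; a minimal sketch (flag values are illustrative):
On Windows:
set SONAR_RUNNER_OPTS=-Xmx512m -XX:MaxPermSize=128m
sonar-runner.bat
On Linux/macOS:
export SONAR_RUNNER_OPTS="-Xmx512m -XX:MaxPermSize=128m"
sonar-runner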

Sqoop installation, export and import from PostgreSQL

I've just installed Sqoop and was testing it. I tried to export some data from HDFS to PostgreSQL using Sqoop. When I run it, it throws the following exception: java.io.IOException: Can't export data, please check task tracker logs. I think there may also be a problem with the installation.
The file content is:
ustNU 45
MB1bA 0
gNbCO 76
iZP10 39
B2aoo 45
SI7eG 93
5sC4k 60
2IhFV 2
u2A48 16
yvy6R 51
LNhsV 26
mZ2yn 65
80Gp3 43
Wk5Ag 85
VUfyp 93
P077j 94
f1Oj5 11
LxJkg 72
0H7NP 99
Dk406 25
g4KRp 76
Fw3U0 80
6LD59 1
07KHx 91
F1S88 72
Bnb0v 85
A2qM7 79
Z6cAt 81
0M3DO 23
m0s09 44
KIvwd 13
GNUD0 78
um93a 20
19bHv 75
4Of3s 75
5hFen 16
This is the Postgres table:
Table "public.mysort"
Column | Type | Modifiers
--------+---------+-----------
name | text |
marks | integer |
The sqoop command is:
sqoop export --connect jdbc:postgresql://localhost/testdb --username akshay --password akshay --table mysort -m 1 --export-dir MySort/input
Followed by the error:
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
14/06/11 18:28:06 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/06/11 18:28:06 INFO manager.SqlManager: Using default fetchSize of 1000
14/06/11 18:28:06 INFO tool.CodeGenTool: Beginning code generation
14/06/11 18:28:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "mysort" AS t LIMIT 1
14/06/11 18:28:06 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hduser/compile/0402ad4b5cf7980040264af35de406cb/mysort.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/06/11 18:28:07 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hduser/compile/0402ad4b5cf7980040264af35de406cb/mysort.jar
14/06/11 18:28:07 INFO mapreduce.ExportJobBase: Beginning export of mysort
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hbase/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/06/11 18:28:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/06/11 18:28:22 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/06/11 18:28:23 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
14/06/11 18:28:23 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
14/06/11 18:28:23 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/06/11 18:28:23 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/06/11 18:28:24 INFO input.FileInputFormat: Total input paths to process : 1
14/06/11 18:28:24 INFO input.FileInputFormat: Total input paths to process : 1
14/06/11 18:28:25 INFO mapreduce.JobSubmitter: number of splits:1
14/06/11 18:28:25 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1402488523460_0003
14/06/11 18:28:25 INFO impl.YarnClientImpl: Submitted application application_1402488523460_0003
14/06/11 18:28:25 INFO mapreduce.Job: The url to track the job: http://localhost:8088/proxy/application_1402488523460_0003/
14/06/11 18:28:25 INFO mapreduce.Job: Running job: job_1402488523460_0003
14/06/11 18:28:46 INFO mapreduce.Job: Job job_1402488523460_0003 running in uber mode : false
14/06/11 18:28:46 INFO mapreduce.Job: map 0% reduce 0%
14/06/11 18:29:04 INFO mapreduce.Job: Task Id : attempt_1402488523460_0003_m_000000_0, Status : FAILED
Error: java.io.IOException: Can't export data, please check task tracker logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.util.NoSuchElementException
at java.util.ArrayList$Itr.next(ArrayList.java:839)
at mysort.__loadFromFields(mysort.java:198)
at mysort.parse(mysort.java:147)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
14/06/11 18:29:23 INFO mapreduce.Job: Task Id : attempt_1402488523460_0003_m_000000_1, Status : FAILED
Error: java.io.IOException: Can't export data, please check task tracker logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.util.NoSuchElementException
at java.util.ArrayList$Itr.next(ArrayList.java:839)
at mysort.__loadFromFields(mysort.java:198)
at mysort.parse(mysort.java:147)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
14/06/11 18:29:42 INFO mapreduce.Job: Task Id : attempt_1402488523460_0003_m_000000_2, Status : FAILED
Error: java.io.IOException: Can't export data, please check task tracker logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.util.NoSuchElementException
at java.util.ArrayList$Itr.next(ArrayList.java:839)
at mysort.__loadFromFields(mysort.java:198)
at mysort.parse(mysort.java:147)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
14/06/11 18:30:03 INFO mapreduce.Job: map 100% reduce 0%
14/06/11 18:30:03 INFO mapreduce.Job: Job job_1402488523460_0003 failed with state FAILED due to: Task failed task_1402488523460_0003_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
14/06/11 18:30:03 INFO mapreduce.Job: Counters: 9
Job Counters
Failed map tasks=4
Launched map tasks=4
Other local map tasks=3
Data-local map tasks=1
Total time spent by all maps in occupied slots (ms)=69336
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=69336
Total vcore-seconds taken by all map tasks=69336
Total megabyte-seconds taken by all map tasks=71000064
14/06/11 18:30:03 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
14/06/11 18:30:03 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 100.1476 seconds (0 bytes/sec)
14/06/11 18:30:03 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
14/06/11 18:30:03 INFO mapreduce.ExportJobBase: Exported 0 records.
14/06/11 18:30:03 ERROR tool.ExportTool: Error during export: Export job failed!
This is the log file:
2014-06-11 17:54:37,601 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
2014-06-11 17:54:37,602 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
2014-06-11 17:54:52,678 WARN [main] org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-06-11 17:54:52,777 INFO [main] org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2014-06-11 17:54:52,846 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2014-06-11 17:54:52,847 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
2014-06-11 17:54:52,855 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2014-06-11 17:54:52,855 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1402488523460_0002, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier#971d0d8)
2014-06-11 17:54:52,901 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2014-06-11 17:54:53,165 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /tmp/hadoop-hduser/nm-local-dir/usercache/hduser/appcache/application_1402488523460_0002
2014-06-11 17:54:53,249 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
2014-06-11 17:54:53,249 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
2014-06-11 17:54:53,393 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2014-06-11 17:54:53,689 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2014-06-11 17:54:53,899 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: Paths:/user/hduser/MySort/input/data.txt:0+891082
2014-06-11 17:54:53,904 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: map.input.file is deprecated. Instead, use mapreduce.map.input.file
2014-06-11 17:54:53,904 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: map.input.start is deprecated. Instead, use mapreduce.map.input.start
2014-06-11 17:54:53,904 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: map.input.length is deprecated. Instead, use mapreduce.map.input.length
2014-06-11 17:54:54,028 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2014-06-11 17:54:54,028 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Exception raised during data export
2014-06-11 17:54:54,028 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2014-06-11 17:54:54,028 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Exception:
java.util.NoSuchElementException
at java.util.ArrayList$Itr.next(ArrayList.java:839)
at mysort.__loadFromFields(mysort.java:198)
at mysort.parse(mysort.java:147)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
2014-06-11 17:54:54,030 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: On input: ustNU 45
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: On input file: hdfs://localhost:9000/user/hduser/MySort/input/data.txt
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: At position 0
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Currently processing split:
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Paths:/user/hduser/MySort/input/data.txt:0+891082
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: This issue might not necessarily be caused by current input
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: due to the batching nature of export.
2014-06-11 17:54:54,031 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2014-06-11 17:54:54,032 INFO [Thread-12] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
2014-06-11 17:54:54,033 WARN [main] org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hduser (auth:SIMPLE) cause:java.io.IOException: Can't export data, please check task tracker logs
2014-06-11 17:54:54,033 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.io.IOException: Can't export data, please check task tracker logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.util.NoSuchElementException
at java.util.ArrayList$Itr.next(ArrayList.java:839)
at mysort.__loadFromFields(mysort.java:198)
at mysort.parse(mysort.java:147)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
2014-06-11 17:54:54,037 INFO [main] org.apache.hadoop.mapred.Task: Runnning cleanup for the task
Any help in resolving the issue is appreciated.
Here is the complete procedure for installing Sqoop, together with the import and export commands. Hopefully it will be helpful to someone. I have tried and tested this myself, and it actually works.
Download: apache.mirrors.tds.net/sqoop/1.4.4/sqoop-1.4.4.bin__hadoop-2.0.4-alpha.tar.gz
Extract the archive and move the extracted directory to /usr/lib/sqoop:
tar -xzf sqoop-1.4.4.bin__hadoop-2.0.4-alpha.tar.gz
sudo mv sqoop-1.4.4.bin__hadoop-2.0.4-alpha /usr/lib/sqoop
Copy and paste the following two lines into your .bashrc:
export SQOOP_HOME=/usr/lib/sqoop
export PATH=$PATH:$SQOOP_HOME/bin
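To apply the change in the current shell (instead of opening a new one):
source ~/.bashrc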
Go to the /usr/lib/sqoop/conf folder, copy sqoop-env-template.sh to a new file sqoop-env.sh, and change the export lines (HADOOP_HOME, HBASE_HOME, etc.) to point at your installation directories.
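For example, sqoop-env.sh might end up with lines like these (the paths are assumptions based on the Hadoop and HBase locations visible in the logs above):
export HADOOP_COMMON_HOME=/usr/local/hadoop
export HADOOP_MAPRED_HOME=/usr/local/hadoop
export HBASE_HOME=/usr/lib/hbase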
Download the PostgreSQL connector JAR from jdbc.postgresql.org/download/postgresql-9.3-1101.jdbc41.jar
Create a directory manager.d in sqoop/conf/.
Create a file named postgresql inside it and add the following line:
org.postgresql.Driver=/usr/lib/sqoop/lib/postgresql-9.3-1101.jdbc41.jar
Place the downloaded connector JAR at that path, renaming it if necessary so that it matches.
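Before running any jobs, a quick sanity check of the installation is worthwhile; a sketch (the second command assumes the testdb database and ace user created in the export steps below):
sqoop version
sqoop list-tables --connect jdbc:postgresql://localhost:5432/testdb --username ace --password ace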
For Export:
Create a user in Postgres:
createuser -P -s -e ace
Enter password for new role: ace
Enter it again: ace
Then, inside psql, create the database and table:
CREATE DATABASE testdb OWNER ace;
create table stud1(id int,name text);
Create a file student.txt
Add lines such as:
1,Ace
2,iloveapis
hadoop fs -put student.txt
sqoop export --connect jdbc:postgresql://localhost:5432/testdb --username ace --password ace --table stud1 -m 1 --export-dir student.txt
Check in Postgres: SELECT * FROM stud1;
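Note that student.txt is comma-separated, which matches Sqoop's default input field delimiter for export. The NoSuchElementException in the question is exactly what the generated parse() code throws when a line yields fewer fields than expected, which fits the space-separated data.txt being split on commas. To export that original file unchanged, passing the delimiter explicitly should help; a sketch (the ' ' delimiter is an assumption about the file; use '\t' if it is tab-separated):
sqoop export --connect jdbc:postgresql://localhost/testdb --username akshay --password akshay --table mysort -m 1 --export-dir MySort/input --input-fields-terminated-by ' '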
For Import:
sqoop import --connect jdbc:postgresql://localhost:5432/testdb --username ace --password ace --table stud1 -m 1
hadoop fs -ls -R stud1
Expected Output:
-rw-r--r-- 1 hduser supergroup 0 2014-06-13 18:10 stud1/_SUCCESS
-rw-r--r-- 1 hduser supergroup 21 2014-06-13 18:10 stud1/part-m-00000
hadoop fs -cat stud1/part-m-00000
Expected Output:
1,Ace
2,iloveapis
hadoop fs -copyToLocal stud1/part-m-00000 $HOME/imported_data.txt