Rails test:models dumps errors after adding an index - PostgreSQL

I'm developing a Ruby application using Ruby 2.6.3, Rails 6.0.2.2, and Postgres 12.
Everything was running fine until I added an index to a table using a migration:
class AddIndexToUserEmail < ActiveRecord::Migration[6.0]
  def change
    add_index :users, :email, unique: true
  end
end
Running rails test:models now produces the following endless dump, which I haven't been able to get rid of.
Any help will be appreciated.
% rails test:models
Run options: --seed 62890
# Running:
Traceback (most recent call last):
20: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/minitest-5.14.0/lib/minitest.rb:68:in `block in autorun'
19: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/minitest-5.14.0/lib/minitest.rb:138:in `run'
18: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `start'
17: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `map'
16: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `each'
15: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `times'
14: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:75:in `block in start'
13: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:75:in `fork'
12: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:95:in `block (2 levels) in start'
11: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1138:in `method_missing'
10: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1158:in `with_friend'
9: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1139:in `block in method_missing'
8: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1227:in `open'
7: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1140:in `block (2 levels) in method_missing'
6: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1251:in `send_message'
5: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:929:in `send_request'
4: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:610:in `send_request'
3: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:610:in `each'
2: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:611:in `block in send_request'
1: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:565:in `dump'
/Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:565:in `dump': no _dump_data is defined for class PG::Connection (TypeError)
25: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/minitest-5.14.0/lib/minitest.rb:68:in `block in autorun'
24: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/minitest-5.14.0/lib/minitest.rb:138:in `run'
23: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `start'
22: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `map'
21: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `each'
20: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `times'
19: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:75:in `block in start'
18: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:75:in `fork'
17: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:95:in `block (2 levels) in start'
16: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1138:in `method_missing'
15: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1158:in `with_friend'
14: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1139:in `block in method_missing'
13: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1227:in `open'
12: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1140:in `block (2 levels) in method_missing'
11: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1251:in `send_message'
10: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:929:in `send_request'
9: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:610:in `send_request'
8: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:610:in `each'
7: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:611:in `block in send_request'
6: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:564:in `dump'
5: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:567:in `rescue in dump'
4: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:650:in `make_proxy'
3: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:650:in `new'
2: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1101:in `initialize'
1: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1807:in `to_id'
/Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1738:in `current_server': DRb::DRbServerNotFound (DRb::DRbServerNotFound)
25: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/minitest-5.14.0/lib/minitest.rb:68:in `block in autorun'
24: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/minitest-5.14.0/lib/minitest.rb:138:in `run'
23: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `start'
22: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `map'
21: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `each'
20: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `times'
19: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:75:in `block in start'
18: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:75:in `fork'
17: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:95:in `block (2 levels) in start'
16: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1138:in `method_missing'
15: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1158:in `with_friend'
14: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1139:in `block in method_missing'
13: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1227:in `open'
12: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1140:in `block (2 levels) in method_missing'
11: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1251:in `send_message'
10: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:929:in `send_request'
9: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:610:in `send_request'
8: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:610:in `each'
7: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:611:in `block in send_request'
6: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:564:in `dump'
5: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:567:in `rescue in dump'
4: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:650:in `make_proxy'
3: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:650:in `new'
2: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1101:in `initialize'
1: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1807:in `to_id'
/Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/2.6.0/drb/drb.rb:1738:in `current_server': DRb::DRbServerNotFound (DRb::DRbConnError)
11: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/minitest-5.14.0/lib/minitest.rb:68:in `block in autorun'
10: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/minitest-5.14.0/lib/minitest.rb:138:in `run'
9: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `start'
8: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `map'
7: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `each'
6: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:74:in `times'
5: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:75:in `block in start'
4: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:75:in `fork'
3: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:94:in `block (2 levels) in start'
2: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:97:in `rescue in block (2 levels) in start'
1: from /Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:97:in `each'
/Users/pdipietro/.rbenv/versions/2.6.3/lib/ruby/gems/2.6.0/gems/activesupport-6.0.2.2/lib/active_support/testing/parallelization.rb:98:in `block (3 levels) in start': undefined method `exception=' for #<Minitest::UnexpectedError: Unexpected exception> (NoMethodError)
Did you mean? exception
[The same TypeError and DRb tracebacks shown above repeat, interleaved across the output of the parallel test workers.]
... more dump existed but limited to 30000 characters

This problem is caused by the latest minitest version (5.14). It has been fixed in Rails, but the fix has not been released yet.
Open your Gemfile and explicitly pin your minitest version:
gem 'minitest', '<=5.13'
then run bundle update minitest. The error message will then show you the real issue with your database.
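If pinning minitest is not an option, another workaround is to stop the forked workers from marshalling objects (such as PG::Connection) over DRb, which is what triggers the dump. A minimal sketch, assuming the default Rails 6 test/test_helper.rb layout:

```ruby
# test/test_helper.rb (default Rails 6 layout assumed)
ENV["RAILS_ENV"] ||= "test"
require_relative "../config/environment"
require "rails/test_help"

class ActiveSupport::TestCase
  # Run tests in a single process instead of forking workers.
  # The "no _dump_data is defined for class PG::Connection" error
  # only occurs when parallel workers exchange results over DRb.
  parallelize(workers: 1)

  fixtures :all
end
```

Disabling parallelization slows the suite down, so treat this as a temporary measure until the Rails fix ships.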

Related

psycopg2: I got the error "'col2' is not in list" using PySpark

Same as the title.
I'm using AWS Glue (Glue 3, Python 3) and the psycopg2 library.
The error occurs when I delete AWS Aurora (PostgreSQL) records.
My syntax is very simple, like this:
tuple_list: [(value1, value2), (value1, value2), ...]
value1: String, value2: timestamp
query = "DELETE FROM {table} WHERE (col1, col2) IN ( VALUES %s)".format(table=table)
extras.execute_values(db.cur(), query, tuple_list, template="(%s, %s)", page_size=2000)
db_conn.commit()
I got the message "'col2' is not in list".
I have no idea why it didn't work.
Thanks for your interest, and have a good day.
Here is my full error message:
Job aborted due to stage failure: Task 1 in stage 3.0 failed 4 times, most recent failure: Lost task 1.3 in stage 3.0 (TID 24) (IP executor 1): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/sql/types.py", line 1571, in __getattr__
idx = self.__fields__.index(item)
ValueError: 'col2' is not in list
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/worker.py", line 604, in main
process()
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/worker.py", line 594, in process
out_iter = func(split_index, iterator)
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 2916, in pipeline_func
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 2916, in pipeline_func
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 2916, in pipeline_func
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 418, in func
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 932, in func
File "/tmp/job_test", line 136, in process_partition
File "/tmp/job_test", line 88, in execute_batch_delete
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/sql/types.py", line 1576, in __getattr__
raise AttributeError(item)
AttributeError: col2
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:517)
at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:652)
at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:635)
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:470)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
at org.apache.spark.InterruptibleIterator.to(InterruptibleIterator.scala:28)
at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
at org.apache.spark.InterruptibleIterator.toBuffer(InterruptibleIterator.scala:28)
at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
at org.apache.spark.InterruptibleIterator.toArray(InterruptibleIterator.scala:28)
at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2278)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Driver stacktrace:

Unable to write spark dataframe to a parquet file format to C drive in PySpark

I am using the following command to try to write a Spark (2.4.4, via Anaconda 3 Jupyter Notebook) dataframe to a parquet file in PySpark, and I get a very strange error message that I cannot resolve. I would appreciate any insights anyone has.
df.write.mode("overwrite").parquet("test/")
Error message is as follows:
--------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
<ipython-input-37-2b4a1d75a5f6> in <module>()
1 # df.write.partitionBy("AB").parquet("C:/test.parquet",mode='overwrite')
----> 2 df.write.mode("overwrite").parquet("test/")
3 # df.write.mode('SaveMode.Overwrite').parquet("C:/test.parquet")
C:\spark-2.4.4-bin-hadoop2.7\python\pyspark\sql\readwriter.py in parquet(self, path, mode, partitionBy, compression)
841 self.partitionBy(partitionBy)
842 self._set_opts(compression=compression)
--> 843 self._jwrite.parquet(path)
844
845 #since(1.6)
C:\spark-2.4.4-bin-hadoop2.7\python\lib\py4j-0.10.7-src.zip\py4j\java_gateway.py in __call__(self, *args)
1255 answer = self.gateway_client.send_command(command)
1256 return_value = get_return_value(
-> 1257 answer, self.gateway_client, self.target_id, self.name)
1258
1259 for temp_arg in temp_args:
C:\spark-2.4.4-bin-hadoop2.7\python\pyspark\sql\utils.py in deco(*a, **kw)
61 def deco(*a, **kw):
62 try:
---> 63 return f(*a, **kw)
64 except py4j.protocol.Py4JJavaError as e:
65 s = e.java_exception.toString()
C:\spark-2.4.4-bin-hadoop2.7\python\lib\py4j-0.10.7-src.zip\py4j\protocol.py in get_return_value(answer, gateway_client, target_id, name)
326 raise Py4JJavaError(
327 "An error occurred while calling {0}{1}{2}.\n".
--> 328 format(target_id, ".", name), value)
329 else:
330 raise Py4JError(
Py4JJavaError: An error occurred while calling o862.parquet.
: org.apache.spark.SparkException: Job aborted.
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:198)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:159)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:566)
at sun.reflect.GeneratedMethodAccessor114.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 52.0 failed 1 times, most recent failure: Lost task 0.0 in stage 52.0 (TID 176, localhost, executor driver): java.io.IOException: (null) entry in command string: null chmod 0644 C:\Users\583621\OneDrive - Booz Allen Hamilton\Personal\Teaching\PySpark Essentials for Data Scientists\PySpark DataFrame Essentials\test\_temporary\0\_temporary\attempt_20191206164455_0052_m_000000_176\part-00000-2cd01dbe-9e3f-44a5-88e1-e904822024c2-c000.snappy.parquet
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:398)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:461)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.parquet.hadoop.util.HadoopOutputFile.create(HadoopOutputFile.java:74)
at org.apache.parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:248)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:390)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:349)
at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.<init>(ParquetOutputWriter.scala:37)
at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anon$1.newInstance(ParquetFileFormat.scala:151)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.newOutputWriter(FileFormatDataWriter.scala:120)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.<init>(FileFormatDataWriter.scala:108)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:236)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:170)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:167)
... 32 more
Caused by: java.io.IOException: (null) entry in command string: null chmod 0644 C:\Users\583621\OneDrive - Booz Allen Hamilton\Personal\Teaching\PySpark Essentials for Data Scientists\PySpark DataFrame Essentials\test\_temporary\0\_temporary\attempt_20191206164455_0052_m_000000_176\part-00000-2cd01dbe-9e3f-44a5-88e1-e904822024c2-c000.snappy.parquet
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:398)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:461)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.parquet.hadoop.util.HadoopOutputFile.create(HadoopOutputFile.java:74)
at org.apache.parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:248)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:390)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:349)
at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.<init>(ParquetOutputWriter.scala:37)
at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anon$1.newInstance(ParquetFileFormat.scala:151)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.newOutputWriter(FileFormatDataWriter.scala:120)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.<init>(FileFormatDataWriter.scala:108)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:236)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:170)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
... 1 more
You need to set a Hadoop home.
You can get the WINUTILS.EXE binary from a Hadoop redistribution; there is a GitHub repository hosting it for several Hadoop versions.
Then either:
1) Set the environment variable %HADOOP_HOME% to the directory above the bin directory that contains WINUTILS.EXE, or
2) Configure it in code:
import sys
import os
os.environ['HADOOP_HOME'] = "C:/Mine/Spark/hadoop-2.6.0"
sys.path.append("C:/Mine/Spark/hadoop-2.6.0/bin")
Hope this helps!

Bag of words with pySpark reduceByKey

I am trying to do some text mining tasks with pySpark. I am new to Spark and I've been following this example http://mccarroll.net/blog/pyspark2/index.html to build a bag of words for my data.
Originally my data looked something like this
df.show(5)
+------------+---------+----------------+--------------------+
|Title |Month | Author | Document|
+------------+---------+----------------+--------------------+
| a | Jan| John |This is a document |
| b | Feb| Mary |A book by Mary |
| c | Mar| Luke |Newspaper article |
+------------+---------+----------------+--------------------+
So far I have extracted the terms of each document with
bow0 = df.rdd\
.map( lambda x: x.Document.replace(',',' ').replace('.',' ').replace('-',' ').lower())\
.flatMap(lambda x: x.split())\
.map(lambda x: (x, 1))
Which gives me
[('This', 1),
('is', 1),
('a', 1),
('document', 1)]
But when I try to compute the frequency with reduceByKey and try to see the result
bow0.reduceByKey(lambda x,y:x+y).take(50)
I get this error:
---------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
<ipython-input-53-966f90775397> in <module>()
----> 1 bow0.reduceByKey(lambda x,y:x+y).take(50)
/usr/local/spark/python/pyspark/rdd.py in take(self, num)
1341
1342 p = range(partsScanned, min(partsScanned + numPartsToTry, totalParts))
-> 1343 res = self.context.runJob(self, takeUpToNumLeft, p)
1344
1345 items += res
/usr/local/spark/python/pyspark/context.py in runJob(self, rdd, partitionFunc, partitions, allowLocal)
990 # SparkContext#runJob.
991 mappedRDD = rdd.mapPartitions(partitionFunc)
--> 992 port = self._jvm.PythonRDD.runJob(self._jsc.sc(), mappedRDD._jrdd, partitions)
993 return list(_load_from_socket(port, mappedRDD._jrdd_deserializer))
994
/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py in __call__(self, *args)
1131 answer = self.gateway_client.send_command(command)
1132 return_value = get_return_value(
-> 1133 answer, self.gateway_client, self.target_id, self.name)
1134
1135 for temp_arg in temp_args:
/usr/local/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
61 def deco(*a, **kw):
62 try:
---> 63 return f(*a, **kw)
64 except py4j.protocol.Py4JJavaError as e:
65 s = e.java_exception.toString()
/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
317 raise Py4JJavaError(
318 "An error occurred while calling {0}{1}{2}.\n".
--> 319 format(target_id, ".", name), value)
320 else:
321 raise Py4JError(
Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 31.0 failed 4 times, most recent failure: Lost task 1.3 in stage 31.0 (TID 84, 9.242.64.15, executor 7): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/usr/local/spark/python/lib/pyspark.zip/pyspark/worker.py", line 177, in main
process()
File "/usr/local/spark/python/lib/pyspark.zip/pyspark/worker.py", line 172, in process
serializer.dump_stream(func(split_index, iterator), outfile)
File "/usr/local/spark/python/pyspark/rdd.py", line 2423, in pipeline_func
return func(split, prev_func(split, iterator))
File "/usr/local/spark/python/pyspark/rdd.py", line 2423, in pipeline_func
return func(split, prev_func(split, iterator))
File "/usr/local/spark/python/pyspark/rdd.py", line 346, in func
return f(iterator)
File "/usr/local/spark/python/pyspark/rdd.py", line 1842, in combineLocally
merger.mergeValues(iterator)
File "/usr/local/spark/python/lib/pyspark.zip/pyspark/shuffle.py", line 236, in mergeValues
for k, v in iterator:
File "<ipython-input-48-5c0753c6b152>", line 1, in <lambda>
AttributeError: 'NoneType' object has no attribute 'replace'
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.api.python.PairwiseRDD.compute(PythonRDD.scala:404)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1517)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1505)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1504)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1504)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1732)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1687)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1676)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2029)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2050)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2069)
at org.apache.spark.api.python.PythonRDD$.runJob(PythonRDD.scala:455)
at org.apache.spark.api.python.PythonRDD.runJob(PythonRDD.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:280)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/usr/local/spark/python/lib/pyspark.zip/pyspark/worker.py", line 177, in main
process()
File "/usr/local/spark/python/lib/pyspark.zip/pyspark/worker.py", line 172, in process
serializer.dump_stream(func(split_index, iterator), outfile)
File "/usr/local/spark/python/pyspark/rdd.py", line 2423, in pipeline_func
return func(split, prev_func(split, iterator))
File "/usr/local/spark/python/pyspark/rdd.py", line 2423, in pipeline_func
return func(split, prev_func(split, iterator))
File "/usr/local/spark/python/pyspark/rdd.py", line 346, in func
return f(iterator)
File "/usr/local/spark/python/pyspark/rdd.py", line 1842, in combineLocally
merger.mergeValues(iterator)
File "/usr/local/spark/python/lib/pyspark.zip/pyspark/shuffle.py", line 236, in mergeValues
for k, v in iterator:
File "<ipython-input-48-5c0753c6b152>", line 1, in <lambda>
AttributeError: 'NoneType' object has no attribute 'replace'
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.api.python.PairwiseRDD.compute(PythonRDD.scala:404)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
... 1 more
To expand on my comment, the error you are receiving is due to the presence of a null value in your Document column. Here's a small example to demonstrate:
data = [
['a', 'Jan', 'John', 'This is a document'],
['b', 'Feb', 'Mary', 'A book by Mary'],
['c', 'Mar', 'Luke', 'Newspaper article'],
['d', 'Apr', 'Mark', None]
]
columns = ['Title', 'Month', 'Author', 'Document']
df = spark.createDataFrame(data, columns)
df.show()
#+-----+-----+------+------------------+
#|Title|Month|Author| Document|
#+-----+-----+------+------------------+
#| a| Jan| John|This is a document|
#| b| Feb| Mary| A book by Mary|
#| c| Mar| Luke| Newspaper article|
#| d| Apr| Mark| null|
#+-----+-----+------+------------------+
For the last row, the value in the Document column is null. When you compute bow0 as in your question, the map function eventually operates on that row and tries to call x.Document.replace, but x.Document is None. This results in AttributeError: 'NoneType' object has no attribute 'replace'.
One way to overcome this is to filter out the bad values before calling map:
bow0 = df.rdd\
.filter(lambda x: x.Document)\
.map( lambda x: x.Document.replace(',',' ').replace('.',' ').replace('-',' ').lower())\
.flatMap(lambda x: x.split())\
.map(lambda x: (x, 1))
bow0.reduceByKey(lambda x,y:x+y).take(50)
#[(u'a', 2),
# (u'this', 1),
# (u'is', 1),
# (u'newspaper', 1),
# (u'article', 1),
# (u'by', 1),
# (u'book', 1),
# (u'mary', 1),
# (u'document', 1)]
Alternatively, you can build the None check into your map function itself. In general, it is good practice to make your map functions robust to bad inputs.
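A sketch of that second option in plain Python, with the tokenizing logic pulled out of the lambda into a (hypothetical) helper so the guard is easy to test on its own:

```python
def tokenize(document):
    """Split a Document value into lowercase words, tolerating null input."""
    if document is None:
        # A null Document yields no words instead of raising AttributeError
        return []
    cleaned = document.replace(',', ' ').replace('.', ' ').replace('-', ' ')
    return cleaned.lower().split()

# In the RDD pipeline this replaces the fragile map/flatMap pair:
# bow0 = df.rdd.flatMap(lambda x: tokenize(x.Document)).map(lambda w: (w, 1))
```

Because the helper always returns a list, flatMap simply emits nothing for null rows and the reduceByKey step never sees the bad input.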
As an aside, you can do the same thing using the DataFrame API functions. In this case:
from pyspark.sql.functions import explode, split, regexp_replace, col, lower
df.select(explode(split(regexp_replace("Document", "[,.-]", " "), r"\s+")).alias("word"))\
.groupby(lower(col("word")).alias("lower"))\
.count()\
.show()
#+---------+-----+
#| lower|count|
#+---------+-----+
#| document| 1|
#| by| 1|
#|newspaper| 1|
#| article| 1|
#| mary| 1|
#| is| 1|
#| a| 2|
#| this| 1|
#| book| 1|
#+---------+-----+

Capistrano 3.4.0 - stopped deploying

I know there are lots of topics regarding Capistrano, but none helped in our case.
Capistrano 3.4.0 suddenly stopped deploying.
The error we get:
INFO [8fd2b54a] Running /usr/bin/env mkdir -p /tmp/000.com/ as deploy#000.com
DEBUG [8fd2b54a] Command: /usr/bin/env mkdir -p /tmp/000.com/
^C(Backtrace restricted to imported tasks)
cap aborted!
Interrupt:
Tasks: TOP => git:check => git:wrapper
(See full trace by running task with --trace)
The deploy has failed with an error:
I have tried several recipes, but I can't figure out what is wrong.
Do you have any ideas?
Now I've got an error:
The deploy has failed with an error: Net::SSH::ConnectionTimeout
deploy#ip-000:~/dep$ cap production deploy --trace
** Invoke production (first_time)
** Execute production
** Invoke load:defaults (first_time)
** Execute load:defaults
** Invoke deploy (first_time)
** Execute deploy
** Invoke deploy:starting (first_time)
** Execute deploy:starting
** Invoke deploy:check (first_time)
** Execute deploy:check
** Invoke git:check (first_time)
** Invoke git:wrapper (first_time)
** Execute git:wrapper
INFO [5c3bfd8b] Running /usr/bin/env mkdir -p /tmp/00/ as deploy#00.com
DEBUG [5c3bfd8b] Command: /usr/bin/env mkdir -p /tmp/00.com/
^Ccap aborted!
Interrupt:
/home/deploy/.rvm/gems/ruby-2.2.3/gems/sshkit-1.8.1/lib/sshkit/runners/parallel.rb:20:in `join'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/sshkit-1.8.1/lib/sshkit/runners/parallel.rb:20:in `map'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/sshkit-1.8.1/lib/sshkit/runners/parallel.rb:20:in `execute'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/sshkit-1.8.1/lib/sshkit/coordinator.rb:21:in `each'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/dsl.rb:55:in `on'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/tasks/git.rake:16:in `block (2 levels) in <top (required)>'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:240:in `call'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:240:in `block in execute'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:235:in `each'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:235:in `execute'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:179:in `block in invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/monitor.rb:211:in `mon_synchronize'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:172:in `invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:201:in `block in invoke_prerequisites'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:199:in `each'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:199:in `invoke_prerequisites'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:178:in `block in invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/monitor.rb:211:in `mon_synchronize'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:172:in `invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:165:in `invoke'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/dsl.rb:16:in `invoke'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/tasks/deploy.rake:36:in `block (2 levels) in <top (required)>'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:240:in `call'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:240:in `block in execute'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:235:in `each'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:235:in `execute'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:179:in `block in invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/monitor.rb:211:in `mon_synchronize'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:172:in `invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:165:in `invoke'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/dsl.rb:16:in `invoke'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/tasks/deploy.rake:4:in `block (2 levels) in <top (required)>'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:240:in `call'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:240:in `block in execute'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:235:in `each'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:235:in `execute'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:179:in `block in invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/monitor.rb:211:in `mon_synchronize'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:172:in `invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:165:in `invoke'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/dsl.rb:16:in `invoke'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/tasks/framework.rake:65:in `block (2 levels) in <top (required)>'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/tasks/framework.rake:64:in `each'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/tasks/framework.rake:64:in `block in <top (required)>'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:240:in `call'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:240:in `block in execute'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:235:in `each'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:235:in `execute'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:179:in `block in invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/monitor.rb:211:in `mon_synchronize'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:172:in `invoke_with_call_chain'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/task.rb:165:in `invoke'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:150:in `invoke_task'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:106:in `block (2 levels) in top_level'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:106:in `each'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:106:in `block in top_level'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:115:in `run_with_threads'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:100:in `top_level'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:78:in `block in run'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:176:in `standard_exception_handling'
/home/deploy/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/rake/application.rb:75:in `run'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/lib/capistrano/application.rb:15:in `run'
/home/deploy/.rvm/gems/ruby-2.2.3/gems/capistrano-3.4.0/bin/cap:3:in `<top (required)>'
/home/deploy/.rvm/gems/ruby-2.2.3/bin/cap:23:in `load'
/home/deploy/.rvm/gems/ruby-2.2.3/bin/cap:23:in `<main>'
/home/deploy/.rvm/gems/ruby-2.2.3/bin/ruby_executable_hooks:15:in `eval'
/home/deploy/.rvm/gems/ruby-2.2.3/bin/ruby_executable_hooks:15:in `<main>'
Tasks: TOP => git:check => git:wrapper
The deploy has failed with an error:
** Invoke deploy:failed (first_time)
** Execute deploy:failed
Can you paste your deploy.rb, in particular the set :repo_url, part?
It seems your repo is not reachable from your server; check the git command and the ssh-agent.
Is it a private repo?
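As a quick sketch of the checks suggested above, you can run the following on the server as the deploy user (the repository URL here is a placeholder; substitute your actual repo_url):

# Is an ssh-agent running with a key loaded?
ssh-add -l

# Does the git host accept the deploy key? (GitHub shown as an example host)
ssh -T git@github.com

# Can git actually read the repo, exactly as Capistrano's git:check does?
git ls-remote git@github.com:your-user/your-repo.git HEAD

If the last command fails, Capistrano's git:check will fail the same way.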

Instance was not initialized, using default configuration

1) Bundle identifier same as one at linked apps
2) Using a tester account
3) Window->G Play Games->Setup->Android Setup - done
4) SDK Manager's packages up to date, play-games-plugin-for-unity up to date
5) My setup code (taken from the Minimal sample and modified a little bit):
void Start()
{
    PlayGamesPlatform.DebugLogEnabled = true;
    PlayGamesPlatform.Activate();
    Social.localUser.Authenticate(success =>
    {
        Debug.Log(success);
    });
    var config = new PlayGamesClientConfiguration.Builder().Build();
    PlayGamesPlatform.InitializeInstance(config);
}
I get the following logcat:
05-11 05:52:22.719 666-676/? I/ActivityManager﹕ START u0 {act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10200000 cmp=com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity bnds=[240,427][240,427]} from pid 1212
05-11 05:52:22.744 666-1206/? I/ActivityManager﹕ Start proc com.Toughwin.MemoryRun for activity com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity: pid=937 uid=10305 gids={50305, 3003}
05-11 05:52:23.098 937-937/? D/ActivityThread﹕ ACT-AM_ON_RESUME_CALLED ActivityRecord{41fdc2c8 token=android.os.BinderProxy#41fdba30 {com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}}
05-11 05:52:23.103 937-937/? D/ActivityThread﹕ ACT-LAUNCH_ACTIVITY handled : 0 / ActivityRecord{41fdc2c8 token=android.os.BinderProxy#41fdba30 {com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}}
05-11 05:52:23.114 137-137/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:0,p:-1,c:137) setConsumerName: com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity
05-11 05:52:23.114 137-137/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:0,p:-1,c:137) setDefaultBufferSize: w=854, h=480
05-11 05:52:23.118 666-684/? I/WindowManager﹕ Gaining focus: Window{420f9d90 u0 com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}
05-11 05:52:23.197 137-338/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:0,p:937,c:137) connect: api=2 producer=(937:com.Toughwin.MemoryRun) producerControlledByApp=true
05-11 05:52:23.197 137-336/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:2,p:937,c:137) new GraphicBuffer needed
05-11 05:52:23.218 137-137/? I/GLConsumer﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e65b30,api:2) [void* android::GLConsumer::createImage(EGLDisplay, const android::sp<android::GraphicBuffer>&, const android::Rect&)]
05-11 05:52:23.224 666-684/? I/ActivityManager﹕ [AppLaunch] Displayed Displayed com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity: +491ms
05-11 05:52:23.224 666-684/? D/ActivityManager﹕ AP_PROF:AppLaunch_LaunchTime:com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity:491:152769531
05-11 05:52:23.370 937-963/? D/MALI﹕ #06 pc 0049f42c /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.370 937-963/? D/MALI﹕ #07 pc 0049f2f0 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.370 937-963/? D/MALI﹕ #08 pc 0049fa98 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.370 937-963/? D/MALI﹕ #09 pc 002e10e8 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.370 937-963/? D/MALI﹕ #10 pc 004b7c9c /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.370 937-963/? D/MALI﹕ #11 pc 004b90a8 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.371 937-963/? D/MALI﹕ #12 pc 004bde28 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.371 937-963/? E/Unity﹕ [EGL] eglChooseConfig(m_EGLDisplay, configAttribs, NULL, 0, &eglConfigCount): EGL_BAD_ATTRIBUTE: An unrecognized attribute or attribute value was passed in the attribute list.
(Filename: ./Runtime/GfxDevice/egl/ConfigEGL.cpp Line: 222)
05-11 05:52:23.374 937-963/? D/MALI﹕ #06 pc 004a08c0 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.374 937-963/? D/MALI﹕ #07 pc 004a0c88 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.374 937-963/? D/MALI﹕ #08 pc 0049f550 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.374 937-963/? D/MALI﹕ #09 pc 0049fb00 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.374 937-963/? D/MALI﹕ #10 pc 002e10e8 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.374 937-963/? D/MALI﹕ #11 pc 004b7c9c /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.374 937-963/? D/MALI﹕ #12 pc 004b90a8 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.374 937-963/? D/MALI﹕ #13 pc 004bde28 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.377 937-963/? D/MALI﹕ #06 pc 004a08c0 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.377 937-963/? D/MALI﹕ #07 pc 004a0ca0 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.377 937-963/? D/MALI﹕ #08 pc 0049f550 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.377 937-963/? D/MALI﹕ #09 pc 0049fb00 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.377 937-963/? D/MALI﹕ #10 pc 002e10e8 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.377 937-963/? D/MALI﹕ #11 pc 004b7c9c /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.377 937-963/? D/MALI﹕ #12 pc 004b90a8 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.377 937-963/? D/MALI﹕ #13 pc 004bde28 /data/app-lib/com.Toughwin.MemoryRun-1/libunity.so
05-11 05:52:23.391 937-963/? D/Unity﹕ GL_EXT_debug_marker GL_OES_texture_npot GL_OES_compressed_ETC1_RGB8_texture GL_OES_standard_derivatives GL_OES_EGL_image GL_OES_depth24 GL_ARM_rgba8 GL_ARM_mali_shader_binary GL_OES_depth_texture GL_OES_packed_depth_stencil GL_EXT_texture_format_BGRA8888 GL_OES_vertex_half_float GL_EXT_blend_minmax GL_OES_EGL_image_external GL_OES_EGL_sync GL_OES_rgb8_rgba8 GL_EXT_multisampled_render_to_texture GL_EXT_discard_framebuffer GL_OES_get_program_binary GL_ARM_mali_program_binary GL_EXT_shader_texture_lod GL_EXT_robustness GL_OES_depth_texture_cube_map GL_KHR_debug
05-11 05:52:23.569 133-598/? D/FrameworkListener﹕ dispatchCommand data = (getaddrinfo stats.unity3d.com ^ 1024 0 1 0 ^)
05-11 05:52:23.570 133-990/? D/libc-netbsd﹕ res_queryN name = stats.unity3d.com, class = 1, type = 1
05-11 05:52:23.570 133-990/? D/libc﹕ QUERY: RECURSIVE stats.unity3d.com (A)
05-11 05:52:23.634 133-990/? D/libc﹕ QUERY: RECURSIVE stats.unity3d.com (A)
05-11 05:52:23.634 133-990/? D/libc﹕ ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 61990
;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 3, ADDITIONAL: 3
;; stats.unity3d.com, type = A, class = IN
stats.unity3d.com. 1H IN A 62.116.219.119
unity3d.com. 44m24s IN NS ns01.unity3d.com.
unity3d.com. 44m24s IN NS ns02.unity3d.com.
unity3d.com. 44m24s IN NS ns03.unity3d.com.
ns01.unity3d.com. 1h44m IN A 62.116.219.114
ns02.unity3d.com. 1h44m IN A 75.126.59.154
ns03.unity3d.com. 1h44m IN A 54.248.81.61
05-11 05:52:23.635 133-990/? D/libc﹕ f2 26 81 80 00 01 00 01 00 03 00 03 05 73 74 61 .&...........sta
74 73 07 75 6e 69 74 79 33 64 03 63 6f 6d 00 00 ts.unity3d.com..
01 00 01 c0 0c 00 01 00 01 00 00 0e 10 00 04 3e ...............>
74 db 77 c0 12 00 02 00 01 00 00 0a 68 00 07 04 t.w.........h...
6e 73 30 31 c0 12 c0 12 00 02 00 01 00 00 0a 68 ns01...........h
00 07 04 6e 73 30 32 c0 12 c0 12 00 02 00 01 00 ...ns02.........
00 0a 68 00 07 04 6e 73 30 33 c0 12 c0 3f 00 01 ..h...ns03...?..
00 01 00 00 18 60 00 04 3e 74 db 72 c0 52 00 01 .....`..>t.r.R..
00 01 00 00 18 60 00 04 4b 7e 3b 9a c0 65 00 01 .....`..K~;..e..
00 01 00 00 18 60 00 04 36 f8 51 3d .....`..6.Q=
05-11 05:52:23.635 133-990/? D/libc-netbsd﹕ res_queryN name = stats.unity3d.com succeed
05-11 05:52:23.635 937-986/? D/libc-netbsd﹕ getaddrinfo: stats.unity3d.com get result from proxy >>
05-11 05:52:23.637 937-986/? I/System.out﹕ [socket][0] connection stats.unity3d.com/62.116.219.119:80;LocalPort=45598(0)
05-11 05:52:23.637 937-986/? I/System.out﹕ [CDS]connect[stats.unity3d.com/62.116.219.119:80] tm:90
05-11 05:52:24.318 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: Activating PlayGamesPlatform.
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.321 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: Instance was not initialized, using default configuration.
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.412 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: PlayGamesPlatform activated: GooglePlayGames.PlayGamesPlatform
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.414 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: Creating platform-specific Play Games client.
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.416 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: Creating real IPlayGamesClient
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.496 937-963/? D/GamesUnitySDK﹕ Performing Android initialization of the GPG SDK
05-11 05:52:24.749 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: Entering state: BeforeRoomCreateStartedState
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.750 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: BeforeRoomCreateStartedState.OnStateEntered: Defaulting to no-op.
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.752 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: Entering state: ShutdownState
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.752 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 DEBUG: ShutdownState.OnStateEntered: Defaulting to no-op.
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:24.762 937-963/? W/Unity﹕ !!! [Play Games Plugin DLL] 05/11/15 5:52:24 +03:00 WARNING: PlayGamesPlatform already initialized. Ignoring this call.
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:25.024 937-1000/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:25 +03:00 DEBUG: Starting Auth Transition. Op: SIGN_IN status: ERROR_NOT_AUTHORIZED
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:25.093 666-684/? I/WindowManager﹕ Losing focus: Window{420f9d90 u0 com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}
05-11 05:52:25.142 937-937/? D/ActivityThread﹕ ACT-AM_ON_PAUSE_CALLED ActivityRecord{41fdc2c8 token=android.os.BinderProxy#41fdba30 {com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}}
05-11 05:52:30.209 937-937/? D/ActivityThread﹕ ACT-AM_ON_RESUME_CALLED ActivityRecord{41fdc2c8 token=android.os.BinderProxy#41fdba30 {com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}}
05-11 05:52:30.225 666-684/? I/WindowManager﹕ Gaining focus: Window{420f9d90 u0 com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}
05-11 05:52:33.932 937-1000/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:33 +03:00 DEBUG: Starting Auth Transition. Op: SIGN_IN status: ERROR_NOT_AUTHORIZED
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:33.967 937-963/? I/Unity﹕ [Play Games Plugin DLL] 05/11/15 5:52:33 +03:00 DEBUG: Invoking user callback on game thread
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:52:33.968 937-963/? I/Unity﹕ False
(Filename: ./artifacts/generated/common/runtime/UnityEngineDebug.gen.cpp Line: 56)
05-11 05:54:16.504 937-937/? D/ActivityThread﹕ ACT-AM_ON_PAUSE_CALLED ActivityRecord{41fdc2c8 token=android.os.BinderProxy#41fdba30 {com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}}
05-11 05:54:16.557 137-137/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:2,p:937,c:137) setDefaultBufferSize: w=480, h=854
05-11 05:54:16.568 666-684/? I/WindowManager﹕ Losing focus: Window{420f9d90 u0 com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}
05-11 05:54:16.622 137-618/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:2,p:-1,c:137) disconnect: api=2
05-11 05:54:16.622 137-618/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:0,p:-1,c:137) getReleasedBuffers: returning mask 0xffffffff
05-11 05:54:16.622 137-618/? I/GLConsumer﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e65b30,api:0) destroying EGLImage dpy=0x1 img=0x10000004
05-11 05:54:16.629 137-137/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:0,p:-1,c:-1) consumerDisconnect
05-11 05:54:16.630 137-137/? I/BufferQueue﹕ [com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity](this:0xb8e9bab8,id:1863,api:0,p:-1,c:-1) ~BufferQueue
05-11 05:54:18.535 666-22270/? I/WindowState﹕ WIN DEATH: Window{420f9d90 u0 com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}
05-11 05:54:18.535 666-22270/? W/WindowManager﹕ Force-removing child win Window{421009f8 u0 SurfaceView} from container Window{420f9d90 u0 com.Toughwin.MemoryRun/com.unity3d.player.UnityPlayerActivity}
Am I missing something? What could it possibly be?
SUCCESS!
Add a keystore (only if you've uploaded the APK to the dev console).
I had forgotten to add the keystore to my app: since I had uploaded the app to the dev console, and a keystore was needed in order to upload it, the installed build had to be signed with that same keystore.
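For anyone hitting the same wall: a keystore can be generated with the JDK's keytool, and its SHA-1 fingerprint is what your Google Play Games configuration must match. The names, passwords, and distinguished name below are placeholders; point Unity's Publishing Settings at the resulting file.

# Generate a release keystore (placeholder names/passwords).
keytool -genkeypair -v \
  -keystore release.keystore -alias release \
  -keyalg RSA -keysize 2048 -validity 10000 \
  -storepass changeit -keypass changeit \
  -dname "CN=Example, OU=Dev, O=Example, C=US"

# Print the SHA-1 fingerprint to register in the dev console.
keytool -list -v -keystore release.keystore -alias release -storepass changeit | grep "SHA1"

Remember that a debug build signed with a different keystore will keep failing sign-in with ERROR_NOT_AUTHORIZED, because the fingerprint won't match the one linked in the console.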