When I run unit tests with django-simple-history==1.6.0 everything runs fine, but with django-simple-history==1.9.0 the unit tests break.
Any ideas how to fix this?
Creating test database for alias 'default'...
Traceback (most recent call last):
File "manage.py", line 12, in <module>
execute_from_command_line(sys.argv)
File "/home/me/venv/lib/python2.7/site-packages/django/core/management/__init__.py", line 338, in execute_from_command_line
utility.execute()
File "/home/me/venv/lib/python2.7/site-packages/django/core/management/__init__.py", line 330, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/home/me/venv/lib/python2.7/site-packages/django/core/management/commands/test.py", line 30, in run_from_argv
super(Command, self).run_from_argv(argv)
File "/home/me/venv/lib/python2.7/site-packages/django/core/management/base.py", line 390, in run_from_argv
self.execute(*args, **cmd_options)
File "/home/me/venv/lib/python2.7/site-packages/django/core/management/commands/test.py", line 74, in execute
super(Command, self).execute(*args, **options)
File "/home/me/venv/lib/python2.7/site-packages/django/core/management/base.py", line 441, in execute
output = self.handle(*args, **options)
File "/home/me/venv/lib/python2.7/site-packages/django/core/management/commands/test.py", line 90, in handle
failures = test_runner.run_tests(test_labels)
File "/home/me/venv/lib/python2.7/site-packages/django/test/runner.py", line 210, in run_tests
old_config = self.setup_databases()
File "/home/me/venv/lib/python2.7/site-packages/django/test/runner.py", line 166, in setup_databases
**kwargs
File "/home/me/venv/lib/python2.7/site-packages/django/test/runner.py", line 370, in setup_databases
serialize=connection.settings_dict.get("TEST", {}).get("SERIALIZE", True),
File "/home/me/venv/lib/python2.7/site-packages/django/db/backends/base/creation.py", line 376, in create_test_db
self.connection._test_serialized_contents = self.serialize_db_to_string()
File "/home/me/venv/lib/python2.7/site-packages/django/db/backends/base/creation.py", line 413, in serialize_db_to_string
serializers.serialize("json", get_objects(), indent=None, stream=out)
File "/home/me/venv/lib/python2.7/site-packages/django/core/serializers/__init__.py", line 129, in serialize
s.serialize(queryset, **options)
File "/home/me/venv/lib/python2.7/site-packages/django/core/serializers/base.py", line 52, in serialize
for obj in queryset:
File "/home/me/venv/lib/python2.7/site-packages/django/db/backends/base/creation.py", line 409, in get_objects
for obj in queryset.iterator():
File "/home/me/venv/lib/python2.7/site-packages/django/db/models/query.py", line 238, in iterator
results = compiler.execute_sql()
File "/home/me/venv/lib/python2.7/site-packages/django/db/models/sql/compiler.py", line 829, in execute_sql
cursor.execute(sql, params)
File "/home/me/venv/lib/python2.7/site-packages/django/db/backends/utils.py", line 64, in execute
return self.cursor.execute(sql, params)
File "/home/me/venv/lib/python2.7/site-packages/django/db/utils.py", line 97, in __exit__
six.reraise(dj_exc_type, dj_exc_value, traceback)
File "/home/me/venv/lib/python2.7/site-packages/django/db/backends/utils.py", line 64, in execute
return self.cursor.execute(sql, params)
File "/home/me/venv/lib/python2.7/site-packages/django/db/backends/sqlite3/base.py", line 318, in execute
return Database.Cursor.execute(self, query, params)
django.db.utils.OperationalError: no such column: myapp_historicalbook.history_change_reason
I got it: I had to remove my migrations folder in order to be able to run the unit tests.
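If you would rather not delete the migrations outright, a hedged alternative sketch (the app label myapp comes from the error message; adjust it to your project) is to regenerate the migration so the historical table gains the history_change_reason column that 1.9.0 expects:

python manage.py makemigrations myapp   # picks up the new history_change_reason field on the historical model
python manage.py migrate
python manage.py test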
I am trying to import a .zip Studio customization file into Odoo version 15. As we are migrating our Odoo system, we want to transfer our Studio customizations from the development database to the production database. I am getting the following error:
1.
Traceback (most recent call last):
File "/home/secusmart/odoo15-prod/odoo/odoo/tools/cache.py", line 85, in lookup
r = d[key]
File "/home/secusmart/odoo15-prod/odoo/odoo/tools/func.py", line 71, in wrapper
return func(self, *args, **kwargs)
File "/home/secusmart/odoo15-prod/odoo/odoo/tools/lru.py", line 34, in __getitem__
a = self.d[obj]
KeyError: ('ir.model.access', <function IrModelAccess.check at 0x7f330bb4f4c0>, 2, False, 'base.import.module', 'write', True, ('en_US',))
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/secusmart/odoo15-prod/odoo/odoo/addons/base/models/ir_http.py", line 237, in _dispatch
result = request.dispatch()
File "/home/secusmart/odoo15-prod/odoo/odoo/http.py", line 688, in dispatch
result = self._call_function(**self.params)
File "/home/secusmart/odoo15-prod/odoo/odoo/http.py", line 360, in _call_function
return checked_call(self.db, *args, **kwargs)
File "/home/secusmart/odoo15-prod/odoo/odoo/service/model.py", line 94, in wrapper
return f(dbname, *args, **kwargs)
File "/home/secusmart/odoo15-prod/odoo/odoo/http.py", line 349, in checked_call
result = self.endpoint(*a, **kw)
File "/home/secusmart/odoo15-prod/odoo/odoo/http.py", line 917, in __call__
return self.method(*args, **kw)
File "/home/secusmart/odoo15-prod/odoo/odoo/http.py", line 536, in response_wrap
response = f(*args, **kw)
File "/home/secusmart/odoo15-prod/odoo/addons/web/controllers/main.py", line 1352, in call_button
action = self._call_kw(model, method, args, kwargs)
File "/home/secusmart/odoo15-prod/odoo/addons/web/controllers/main.py", line 1340, in _call_kw
return call_kw(request.env[model], method, args, kwargs)
File "/home/secusmart/odoo15-prod/odoo/odoo/api.py", line 464, in call_kw
result = _call_kw_multi(method, model, args, kwargs)
File "/home/secusmart/odoo15-prod/odoo/odoo/api.py", line 451, in _call_kw_multi
result = method(recs, *args, **kwargs)
File "/home/secusmart/odoo15-prod/odoo/addons/base_import_module/models/base_import_module.py", line 24, in import_module
self.write({'state': 'done', 'import_message': res[0]})
File "/home/secusmart/odoo15-prod/odoo/odoo/models.py", line 3763, in write
self.check_access_rights('write')
File "/home/secusmart/odoo15-prod/odoo/odoo/models.py", line 3538, in check_access_rights
return self.env['ir.model.access'].check(self._name, operation, raise_exception)
File "<decorator-gen-33>", line 2, in check
File "/home/secusmart/odoo15-prod/odoo/odoo/tools/cache.py", line 90, in lookup
value = d[key] = self.method(*args, **kwargs)
File "/home/secusmart/odoo15-prod/odoo/odoo/addons/base/models/ir_model.py", line 1762, in check
self._cr.execute("""SELECT MAX(CASE WHEN perm_{mode} THEN 1 ELSE 0 END)
File "<decorator-gen-3>", line 2, in execute
File "/home/secusmart/odoo15-prod/odoo/odoo/sql_db.py", line 89, in check
return f(self, *args, **kwargs)
File "/home/secusmart/odoo15-prod/odoo/odoo/sql_db.py", line 310, in execute
res = self._obj.execute(query, params)
Exception
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/secusmart/odoo15-prod/odoo/odoo/http.py", line 644, in _handle_exception
return super(JsonRequest, self)._handle_exception(exception)
File "/home/secusmart/odoo15-prod/odoo/odoo/http.py", line 302, in _handle_exception
raise exception.with_traceback(None) from new_cause
psycopg2.errors.InFailedSqlTransaction: current transaction is aborted, commands ignored until end of transaction block
I fixed the first error by performing the import as superuser.
Now I get the following error:
2.
Traceback (most recent call last):
File "/home/secusmart/odoo15-prod/odoo/odoo/tools/cache.py", line 85, in lookup
r = d[key]
File "/home/secusmart/odoo15-prod/odoo/odoo/tools/func.py", line 71, in wrapper
return func(self, *args, **kwargs)
File "/home/secusmart/odoo15-prod/odoo/odoo/tools/lru.py", line 34, in __getitem__
a = self.d[obj]
KeyError: ('res.lang', <function Lang.get_installed at 0x7f330ab8b4c0>)
You can refer to this answer: Is it possible to import Odoo15 Studio .zip file to Odoo16?
As a small recap: you should use the migration tools offered by Odoo. You have an Enterprise subscription and are on a version of Odoo that is still supported, so having your database migrated by Odoo should already be possible, and free.
Description
I am currently implementing a batch job on Dataflow using Apache Beam. It works fine when I use the DirectRunner; switching the runner throws the exception below, and I am not sure where the error is coming from.
Code Snippet
(The code snippet was posted as an image and is not reproduced here.)
Error:
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/beam_mysql/connector/client.py", line 179, in enter
self.conn = mysql.connector.connect(**self._config)
File "/usr/local/lib/python3.9/site-packages/mysql/connector/pooling.py", line 286, in connect
return CMySQLConnection(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/mysql/connector/connection_cext.py", line 101, in init
self.connect(**kwargs)
File "/usr/local/lib/python3.9/site-packages/mysql/connector/abstracts.py", line 1095, in connect
self._open_connection()
File "/usr/local/lib/python3.9/site-packages/mysql/connector/connection_cext.py", line 268, in _open_connection
raise get_mysql_exception(
mysql.connector.errors.DatabaseError: 2003 (HY000): Can't connect to MySQL server on '127.0.0.1:41849' (111)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py", line 284, in _execute
response = task()
File "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py", line 357, in
lambda: self.create_worker().do_instruction(request), request)
File "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py", line 597, in do_instruction
return getattr(self, request_type)(
File "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py", line 635, in process_bundle
bundle_processor.process_bundle(instruction_id))
File "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1003, in process_bundle
input_op_by_transform_id[element.transform_id].process_encoded(
File "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/bundle_processor.py", line 227, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 526, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 528, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 237, in apache_beam.runners.worker.operations.SingletonElementConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 240, in apache_beam.runners.worker.operations.SingletonElementConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 907, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 908, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1419, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1491, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 1417, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 623, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1581, in apache_beam.runners.common._OutputHandler.handle_process_outputs
File "apache_beam/runners/common.py", line 1694, in apache_beam.runners.common._OutputHandler._write_value_to_tag
File "apache_beam/runners/worker/operations.py", line 240, in apache_beam.runners.worker.operations.SingletonElementConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 907, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 908, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1419, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1491, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 1417, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 623, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1581, in apache_beam.runners.common._OutputHandler.handle_process_outputs
File "apache_beam/runners/common.py", line 1694, in apache_beam.runners.common._OutputHandler._write_value_to_tag
File "apache_beam/runners/worker/operations.py", line 240, in apache_beam.runners.worker.operations.SingletonElementConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 907, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 908, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1419, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1507, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 1417, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 623, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1571, in apache_beam.runners.common._OutputHandler.handle_process_outputs
File "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1454, in process
for part, size in self.restriction_provider.split_and_size(
File "/usr/local/lib/python3.9/site-packages/apache_beam/transforms/core.py", line 331, in split_and_size
for part in self.split(element, restriction):
File "/usr/local/lib/python3.9/site-packages/apache_beam/io/iobase.py", line 1641, in split
estimated_size = restriction.source().estimate_size()
File "/usr/local/lib/python3.9/site-packages/beam_mysql/connector/source.py", line 49, in estimate_size
return self._splitter.estimate_size()
File "/usr/local/lib/python3.9/site-packages/beam_mysql/connector/splitters.py", line 48, in estimate_size
return self.source.client.rough_counts_estimator(self.source.query)
File "/usr/local/lib/python3.9/site-packages/beam_mysql/connector/client.py", line 104, in rough_counts_estimator
with _MySQLConnection(self._config) as conn:
File "/usr/local/lib/python3.9/site-packages/beam_mysql/connector/client.py", line 182, in enter
raise MySQLClientError(f"Failed to connect mysql, Raise exception: {e}")
beam_mysql.connector.errors.MySQLClientError: Failed to connect mysql, Raise exception: 2003 (HY000): Can't connect to MySQL server on '127.0.0.1:36247' (111) [while running 'ref_AppliedPTransform_Read-From-Mysql-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing-ptransform-38']
You have a connection problem to MySQL.
Your traceback indicates:
mysql.connector.errors.DatabaseError: 2003 (HY000): Can't connect to MySQL server on '127.0.0.1:41849'
https://dev.mysql.com/doc/refman/8.0/en/can-not-connect-to-server.html#:~:text=The%20error%20(2003)%20Can',one%20configured%20on%20the%20server.
Please check the connection to your MySQL server and the connection parameters passed to your Beam job. Note that when the job runs on Dataflow instead of the DirectRunner, the workers are remote machines, so a host of 127.0.0.1 points at the worker itself rather than at your database.
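As a hedged sanity check (the host, credentials and database below are placeholders, not values from the question), you can verify that the server accepts connections from an environment the workers can reach before wiring the same parameters into the pipeline:

import mysql.connector

# Placeholder connection details: use a host the Dataflow workers can actually
# reach (for example a Cloud SQL or VPC-internal address), not 127.0.0.1.
conn = mysql.connector.connect(
    host="10.0.0.5",
    port=3306,
    user="beam_user",
    password="secret",
    database="mydb",
)
conn.close()  # if this succeeds, the same parameters should work in the job config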
I am trying to install Airflow 2.0.1 with Ansible on a CentOS 8 machine, with Python 3.8.1. I pinned pip to 20.2.4 as suggested in the Airflow docs.
I am using PostgreSQL, and the airflow db check task succeeds. But the db init task gives the following error. I also tried running airflow db init manually, with the same result:
ERROR - Failed to add operation for GET /api/v1/connections
Traceback (most recent call last):
File "/opt/airflow/lib/python3.8/site-packages/connexion/apis/abstract.py", line 209, in add_paths
self.add_operation(path, method)
File "/opt/airflow/lib/python3.8/site-packages/connexion/apis/abstract.py", line 162, in add_operation
operation = make_operation(
File "/opt/airflow/lib/python3.8/site-packages/connexion/operations/__init__.py", line 8, in make_operation
return spec.operation_cls.from_spec(spec, *args, **kwargs)
File "/opt/airflow/lib/python3.8/site-packages/connexion/operations/openapi.py", line 128, in from_spec
return cls(
File "/opt/airflow/lib/python3.8/site-packages/connexion/operations/openapi.py", line 75, in __init__
super(OpenAPIOperation, self).__init__(
File "/opt/airflow/lib/python3.8/site-packages/connexion/operations/abstract.py", line 96, in __init__
self._resolution = resolver.resolve(self)
File "/opt/airflow/lib/python3.8/site-packages/connexion/resolver.py", line 40, in resolve
return Resolution(self.resolve_function_from_operation_id(operation_id), operation_id)
File "/opt/airflow/lib/python3.8/site-packages/connexion/resolver.py", line 66, in resolve_function_from_operation_id
raise ResolverError(str(e), sys.exc_info())
connexion.exceptions.ResolverError: <ResolverError: columns>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/airflow/bin/airflow", line 8, in <module>
sys.exit(main())
File "/opt/airflow/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/opt/airflow/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/opt/airflow/lib/python3.8/site-packages/airflow/cli/commands/db_command.py", line 31, in initdb
db.initdb()
File "/opt/airflow/lib/python3.8/site-packages/airflow/utils/db.py", line 549, in initdb
upgradedb()
File "/opt/airflow/lib/python3.8/site-packages/airflow/utils/db.py", line 684, in upgradedb
command.upgrade(config, 'heads')
File "/opt/airflow/lib/python3.8/site-packages/alembic/command.py", line 294, in upgrade
script.run_env()
File "/opt/airflow/lib/python3.8/site-packages/alembic/script/base.py", line 490, in run_env
util.load_python_file(self.dir, "env.py")
File "/opt/airflow/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 97, in load_python_file
module = load_module_py(module_id, path)
File "/opt/airflow/lib/python3.8/site-packages/alembic/util/compat.py", line 182, in load_module_py
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/opt/airflow/lib/python3.8/site-packages/airflow/migrations/env.py", line 108, in <module>
run_migrations_online()
File "/opt/airflow/lib/python3.8/site-packages/airflow/migrations/env.py", line 102, in run_migrations_online
context.run_migrations()
File "<string>", line 8, in run_migrations
File "/opt/airflow/lib/python3.8/site-packages/alembic/runtime/environment.py", line 813, in run_migrations
self.get_context().run_migrations(**kw)
File "/opt/airflow/lib/python3.8/site-packages/alembic/runtime/migration.py", line 561, in run_migrations
step.migration_fn(**kw)
File "/opt/airflow/lib/python3.8/site-packages/airflow/migrations/versions/2c6edca13270_resource_based_permissions.py", line 314, in upgrade
remap_permissions()
File "/opt/airflow/lib/python3.8/site-packages/airflow/migrations/versions/2c6edca13270_resource_based_permissions.py", line 289, in remap_permissions
appbuilder = create_app(config={'FAB_UPDATE_PERMS': False}).appbuilder
File "/opt/airflow/lib/python3.8/site-packages/airflow/www/app.py", line 120, in create_app
init_api_connexion(flask_app)
File "/opt/airflow/lib/python3.8/site-packages/airflow/www/extensions/init_views.py", line 171, in init_api_connexion
api_bp = connexion_app.add_api(
File "/opt/airflow/lib/python3.8/site-packages/connexion/apps/flask_app.py", line 57, in add_api
api = super(FlaskApp, self).add_api(specification, **kwargs)
File "/opt/airflow/lib/python3.8/site-packages/connexion/apps/abstract.py", line 144, in add_api
api = self.api_cls(specification,
File "/opt/airflow/lib/python3.8/site-packages/connexion/apis/abstract.py", line 111, in __init__
self.add_paths()
File "/opt/airflow/lib/python3.8/site-packages/connexion/apis/abstract.py", line 216, in add_paths
self._handle_add_operation_error(path, method, err.exc_info)
File "/opt/airflow/lib/python3.8/site-packages/connexion/apis/abstract.py", line 231, in _handle_add_operation_error
raise value.with_traceback(traceback)
File "/opt/airflow/lib/python3.8/site-packages/connexion/resolver.py", line 61, in resolve_function_from_operation_id
return self.function_resolver(operation_id)
File "/opt/airflow/lib/python3.8/site-packages/connexion/utils.py", line 111, in get_function_from_name
module = importlib.import_module(module_name)
File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/opt/airflow/lib/python3.8/site-packages/airflow/api_connexion/endpoints/connection_endpoint.py", line 26, in <module>
from airflow.api_connexion.schemas.connection_schema import (
File "/opt/airflow/lib/python3.8/site-packages/airflow/api_connexion/schemas/connection_schema.py", line 42, in <module>
class ConnectionSchema(ConnectionCollectionItemSchema): # pylint: disable=too-many-ancestors
File "/opt/airflow/lib/python3.8/site-packages/marshmallow/schema.py", line 121, in __new__
klass._declared_fields = mcs.get_declared_fields(
File "/opt/airflow/lib/python3.8/site-packages/marshmallow_sqlalchemy/schema/sqlalchemy_schema.py", line 94, in get_declared_fields
fields.update(mcs.get_auto_fields(fields, converter, opts, dict_cls))
File "/opt/airflow/lib/python3.8/site-packages/marshmallow_sqlalchemy/schema/sqlalchemy_schema.py", line 104, in get_auto_fields
{
File "/opt/airflow/lib/python3.8/site-packages/marshmallow_sqlalchemy/schema/sqlalchemy_schema.py", line 105, in <dictcomp>
field_name: field.create_field(
File "/opt/airflow/lib/python3.8/site-packages/marshmallow_sqlalchemy/schema/sqlalchemy_schema.py", line 28, in create_field
return converter.field_for(model, column_name, **self.field_kwargs)
File "/opt/airflow/lib/python3.8/site-packages/marshmallow_sqlalchemy/convert.py", line 171, in field_for
return self.property2field(prop, **kwargs)
File "/opt/airflow/lib/python3.8/site-packages/marshmallow_sqlalchemy/convert.py", line 146, in property2field
field_class = field_class or self._get_field_class_for_property(prop)
File "/opt/airflow/lib/python3.8/site-packages/marshmallow_sqlalchemy/convert.py", line 210, in _get_field_class_for_property
column = prop.columns[0]
File "/opt/airflow/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 1220, in __getattr__
return self._fallback_getattr(key)
File "/opt/airflow/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 1194, in _fallback_getattr
raise AttributeError(key)
AttributeError: columns
I'm having the same issue; my temporary fix was to use SQLAlchemy < 1.4. Maybe that works for you.
What worked for me was setting AIRFLOW_HOME and AIRFLOW__CORE__SQL_ALCHEMY_CONN and running pip install SQLAlchemy==1.3.24.
For me, on Airflow 2.3.1, upgrading marshmallow-sqlalchemy helped:
pip install -U marshmallow-sqlalchemy
I set the AIRFLOW_HOME and AIRFLOW__CORE__SQL_ALCHEMY_CONN environment variables before running airflow db init.
Playbook version:
environment:
AIRFLOW_HOME: "{{ airflow_app_home }}"
AIRFLOW__CORE__SQL_ALCHEMY_CONN: "{{ airflow_database_conn }}"
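For reference, a minimal shell equivalent of the same approach (the path and connection string are placeholders, not the actual values from this setup):

export AIRFLOW_HOME=/opt/airflow
export AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow_user:airflow_pass@localhost:5432/airflow
airflow db init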
I am trying to execute a Jupyter notebook from the command line. I am currently using the Docker image provided at the link below:
https://hub.docker.com/r/jupyter/pyspark-notebook
When I try to execute the command below from the command line, it fails:
jupyter nbconvert --to notebook --ExecutePreprocessor.kernel_name=PySpark --ExecutePreprocessor.timeout=3600 --execute notebooks/sample-notebook.ipynb
Below is the error message:
[NbConvertApp] Converting notebook notebooks/sample-notebook.ipynb to notebook
Traceback (most recent call last):
File "/opt/conda/bin/jupyter-nbconvert", line 11, in <module>
sys.exit(main())
File "/opt/conda/lib/python3.7/site-packages/jupyter_core/application.py", line 270, in launch_instance
return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
File "/opt/conda/lib/python3.7/site-packages/traitlets/config/application.py", line 664, in launch_instance
app.start()
File "/opt/conda/lib/python3.7/site-packages/nbconvert/nbconvertapp.py", line 340, in start
self.convert_notebooks()
File "/opt/conda/lib/python3.7/site-packages/nbconvert/nbconvertapp.py", line 510, in convert_notebooks
self.convert_single_notebook(notebook_filename)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/nbconvertapp.py", line 481, in convert_single_notebook
output, resources = self.export_single_notebook(notebook_filename, resources, input_buffer=input_buffer)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/nbconvertapp.py", line 410, in export_single_notebook
output, resources = self.exporter.from_filename(notebook_filename, resources=resources)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/exporters/exporter.py", line 179, in from_filename
return self.from_file(f, resources=resources, **kw)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/exporters/exporter.py", line 197, in from_file
return self.from_notebook_node(nbformat.read(file_stream, as_version=4), resources=resources, **kw)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/exporters/notebook.py", line 32, in from_notebook_node
nb_copy, resources = super(NotebookExporter, self).from_notebook_node(nb, resources, **kw)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/exporters/exporter.py", line 139, in from_notebook_node
nb_copy, resources = self._preprocess(nb_copy, resources)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/exporters/exporter.py", line 316, in _preprocess
nbc, resc = preprocessor(nbc, resc)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/preprocessors/base.py", line 47, in __call__
return self.preprocess(nb, resources)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/preprocessors/execute.py", line 403, in preprocess
with self.setup_preprocessor(nb, resources, km=km):
File "/opt/conda/lib/python3.7/contextlib.py", line 112, in __enter__
return next(self.gen)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/preprocessors/execute.py", line 345, in setup_preprocessor
self.km, self.kc = self.start_new_kernel(**kwargs)
File "/opt/conda/lib/python3.7/site-packages/nbconvert/preprocessors/execute.py", line 291, in start_new_kernel
km.start_kernel(extra_arguments=self.extra_arguments, **kwargs)
File "/opt/conda/lib/python3.7/site-packages/jupyter_client/manager.py", line 301, in start_kernel
kernel_cmd, kw = self.pre_start_kernel(**kw)
File "/opt/conda/lib/python3.7/site-packages/jupyter_client/manager.py", line 254, in pre_start_kernel
kernel_cmd = self.format_kernel_cmd(extra_arguments=extra_arguments)
File "/opt/conda/lib/python3.7/site-packages/jupyter_client/manager.py", line 178, in format_kernel_cmd
cmd = self.kernel_spec.argv + extra_arguments
File "/opt/conda/lib/python3.7/site-packages/jupyter_client/manager.py", line 84, in kernel_spec
self._kernel_spec = self.kernel_spec_manager.get_kernel_spec(self.kernel_name)
File "/opt/conda/lib/python3.7/site-packages/jupyter_client/kernelspec.py", line 235, in get_kernel_spec
raise NoSuchKernel(kernel_name)
jupyter_client.kernelspec.NoSuchKernel: No such kernel named PySpark
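For reference, the kernel name passed via --ExecutePreprocessor.kernel_name has to match one of the kernel specs registered inside the container; those can be listed with:
jupyter kernelspec list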
I am currently using allure-pytest-adaptor 1.7.8, pytest 3.2.1 and pytest-xdist 1.20.0.
The issue only shows up when I use xdist to run tests in parallel; when I run all the tests serially there is no such issue:
If there is only one failure or none, the Allure report is generated.
When there is more than one failure in a test run, the Allure report cannot be generated. Stack trace and error messages:
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/bin/py.test", line 11, in <module>
sys.exit(main())
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/config.py", line 58, in main
return config.hook.pytest_cmdline_main(config=config)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in __call__
return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in <lambda>
_MultiCall(methods, kwargs, hook.spec_opts).execute()
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 614, in execute
res = hook_impl.function(*args)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/main.py", line 139, in pytest_cmdline_main
return wrap_session(config, _main)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/main.py", line 133, in wrap_session
exitstatus=session.exitstatus)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in __call__
return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in <lambda>
_MultiCall(methods, kwargs, hook.spec_opts).execute()
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 613, in execute
return _wrapped_call(hook_impl.function(*args), self.execute)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 250, in _wrapped_call
wrap_controller.send(call_outcome)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/terminal.py", line 406, in pytest_sessionfinish
outcome.get_result()
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
raise ex[1].with_traceback(ex[2])
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 265, in __init__
self.result = func()
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 614, in execute
res = hook_impl.function(*args)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/pytest_plugin.py", line 494, in pytest_sessionfinish
self.impl._write_xml(f, s)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/common.py", line 254, in _write_xml
xmlfied.toxml(),
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 129, in toxml
manys = sum([[(m[0], v) for v in m[1]] for m in entries(Many)], [])
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 123, in entries
for (name, rule) in items
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 124, in <listcomp>
if isinstance(rule, clazz) and rule.check(getattr(self, name))]
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 109, in value
values = super(WrappedMany, self).value(name, what)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 103, in value
return [self.rule.value(name, x) for x in what]
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 103, in <listcomp>
return [self.rule.value(name, x) for x in what]
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 92, in value
return what.toxml()
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 128, in toxml
nested = entries(Nested)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 123, in entries
for (name, rule) in items
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 124, in <listcomp>
if isinstance(rule, clazz) and rule.check(getattr(self, name))]
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 92, in value
return what.toxml()
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 126, in toxml
elements = entries(Element)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 123, in entries
for (name, rule) in items
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 124, in <listcomp>
if isinstance(rule, clazz) and rule.check(getattr(self, name))]
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/rules.py", line 80, in value
return element_maker(self.name or name, self.namespace)(legalize_xml(unicodify(what)))
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/allure/utils.py", line 126, in unicodify
return text_type(something) # #UndefinedVariable
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/_code/code.py", line 694, in __str__
s = self.__unicode__()
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/_code/code.py", line 704, in __unicode__
self.toterminal(tw)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/_code/code.py", line 735, in toterminal
element[0].toterminal(tw)
File "/data/jenkins/workspace/Pilot_WebUI_Python_Functional_Test/automated-tests/.tox/jenkins-webui/lib/python3.5/site-packages/_pytest/_code/code.py", line 764, in toterminal
if entry.style == "long":
AttributeError: 'dict' object has no attribute 'style'
I was running into the same issue using:
Python 2.7.10
allure-pytest-adaptor 1.7.9
pytest 3.0.0
pytest-xdist 1.20.0
I found this link, but I am not able to upgrade the plugins and my Python version at present:
https://github.com/pytest-dev/pytest/issues/2811
If the above works for you, great; but if you find yourself in need of a patch for the problem you are having, I have one that may be of use.
The problem (I think) is in how xdist failures interact with pytest and the pytest-allure-adaptor plugin: combined, they pass failure output around as a dict rather than as a class instance, which is what pytest's ReprTraceback code expects when it works with failure output data.
This can be solved by storing the test report attributes on an object via its __dict__. It turns out the author had already partially addressed this. I simply updated the existing method in the allure pytest_plugin.py to expose all the necessary report data, then replaced the original report-finalizer conditionals with the updated method. I have a patch here, in a forked version of the plugin:
https://github.com/weeksghost/allure-python/tree/parallel-test-results
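To illustrate the __dict__ idea in isolation (the class and keys below are hypothetical, not the plugin's actual names), wrapping the dict-shaped entries in a small object restores the attribute access that code like entry.style relies on:

class ReportEntry:
    def __init__(self, data):
        # Copy the dict's keys onto the instance so lookups such as
        # entry.style work instead of raising
        # AttributeError: 'dict' object has no attribute 'style'.
        self.__dict__.update(data)

entry = ReportEntry({"style": "long", "lines": []})
print(entry.style)  # -> long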
I'm sure there are better ways to handle this, but it looks like this version of the Allure project is no longer supported in favor of an updated version, which I think is a great upgrade. I will be upgrading, but at my own pace, hence the hack.
I hope this is helpful.