For reference, this is the output I see.

$ time env AIRFLOW__CORE__LOGGING_LEVEL=DEBUG \
    AIRFLOW__CORE__DAGS_FOLDER="$PWD/../tests/dags" \
    AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql:///airflow airflow resetdb
[2019-10-27 16:36:45,778] {settings.py:211} DEBUG - Setting up DB connection 
pool (PID 42791)
[2019-10-27 16:36:45,778] {settings.py:252} INFO - settings.configure_orm(): 
Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=42791
/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/psycopg2/__init__.py:144:
 UserWarning: The psycopg2 wheel package will be renamed from release 2.8; in 
order to keep installing from binary please use "pip install psycopg2-binary" 
instead. For details see: 
<http://initd.org/psycopg/docs/install.html#binary-install-from-pypi>.
  """)
[2019-10-27 16:36:45,950] {cli_action_loggers.py:41} DEBUG - Adding <function 
default_action_log at 0x106a93048> to pre execution callback
DB: postgresql:///airflow
This will drop existing tables if they exist. Proceed? (y/n)y
[2019-10-27 16:36:47,331] {db.py:389} INFO - Dropping tables that exist
[2019-10-27 16:36:47,591] {migration.py:130} INFO - Context impl PostgresqlImpl.
[2019-10-27 16:36:47,591] {migration.py:137} INFO - Will assume transactional 
DDL.
[2019-10-27 16:36:47,700] {db.py:368} INFO - Creating tables
INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade  -> e3a246e0dc1, current 
schema
INFO  [alembic.runtime.migration] Running upgrade e3a246e0dc1 -> 1507a7289a2f, 
create is_encrypted
INFO  [alembic.runtime.migration] Running upgrade 1507a7289a2f -> 13eb55f81627, 
maintain history for compatibility with earlier migrations
INFO  [alembic.runtime.migration] Running upgrade 13eb55f81627 -> 338e90f54d61, 
More logging into task_instance
INFO  [alembic.runtime.migration] Running upgrade 338e90f54d61 -> 52d714495f0, 
job_id indices
INFO  [alembic.runtime.migration] Running upgrade 52d714495f0 -> 502898887f84, 
Adding extra to Log
INFO  [alembic.runtime.migration] Running upgrade 502898887f84 -> 1b38cef5b76e, 
add dagrun
INFO  [alembic.runtime.migration] Running upgrade 1b38cef5b76e -> 2e541a1dcfed, 
task_duration
INFO  [alembic.runtime.migration] Running upgrade 2e541a1dcfed -> 40e67319e3a9, 
dagrun_config
INFO  [alembic.runtime.migration] Running upgrade 40e67319e3a9 -> 561833c1c74b, 
add password column to user
INFO  [alembic.runtime.migration] Running upgrade 561833c1c74b -> 4446e08588, 
dagrun start end
INFO  [alembic.runtime.migration] Running upgrade 4446e08588 -> bbc73705a13e, 
Add notification_sent column to sla_miss
INFO  [alembic.runtime.migration] Running upgrade bbc73705a13e -> bba5a7cfc896, 
Add a column to track the encryption state of the 'Extra' field in connection
INFO  [alembic.runtime.migration] Running upgrade bba5a7cfc896 -> 1968acfc09e3, 
add is_encrypted column to variable table
INFO  [alembic.runtime.migration] Running upgrade 1968acfc09e3 -> 2e82aab8ef20, 
rename user table
INFO  [alembic.runtime.migration] Running upgrade 2e82aab8ef20 -> 211e584da130, 
add TI state index
INFO  [alembic.runtime.migration] Running upgrade 211e584da130 -> 64de9cddf6c9, 
add task fails journal table
INFO  [alembic.runtime.migration] Running upgrade 64de9cddf6c9 -> f2ca10b85618, 
add dag_stats table
INFO  [alembic.runtime.migration] Running upgrade f2ca10b85618 -> 4addfa1236f1, 
Add fractional seconds to mysql tables
INFO  [alembic.runtime.migration] Running upgrade 4addfa1236f1 -> 8504051e801b, 
xcom dag task indices
INFO  [alembic.runtime.migration] Running upgrade 8504051e801b -> 5e7d17757c7a, 
add pid field to TaskInstance
INFO  [alembic.runtime.migration] Running upgrade 5e7d17757c7a -> 127d2bf2dfa7, 
Add dag_id/state index on dag_run table
INFO  [alembic.runtime.migration] Running upgrade 127d2bf2dfa7 -> cc1e65623dc7, 
add max tries column to task instance
WARNI [airflow.utils.log.logging_mixin.LoggingMixin] You have configured a 
result_backend of redis://localhost/1 #db+postgres:///airflow, it is highly 
recommended to use an alternative result_backend (i.e. a database).
ERROR [airflow.models.dagbag.DagBag] Failed to import: 
/Users/ash/code/python/incubator-airflow/airflow_home/../tests/dags/test_clear_subdag.py
Traceback (most recent call last):
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1244, in _execute_context
    cursor, statement, parameters, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/default.py",
 line 550, in do_execute
    cursor.execute(statement, parameters)
psycopg2.ProgrammingError: relation "slot_pool" does not exist
LINE 2: FROM slot_pool
             ^


The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/ash/code/python/incubator-airflow/airflow/models/dagbag.py", 
line 204, in process_file
    m = imp.load_source(mod_name, filepath)
  File 
"/Users/ash/.homebrew/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/imp.py",
 line 171, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 696, in _load
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File 
"/Users/ash/code/python/incubator-airflow/airflow_home/../tests/dags/test_clear_subdag.py",
 line 71, in <module>
    daily_job = create_subdag_opt(main_dag=dag)
  File 
"/Users/ash/code/python/incubator-airflow/airflow_home/../tests/dags/test_clear_subdag.py",
 line 44, in create_subdag_opt
    dag=main_dag,
  File "/Users/ash/code/python/incubator-airflow/airflow/utils/db.py", line 74, 
in wrapper
    return func(*args, **kwargs)
  File "/Users/ash/code/python/incubator-airflow/airflow/utils/decorators.py", 
line 98, in wrapper
    result = func(*args, **kwargs)
  File 
"/Users/ash/code/python/incubator-airflow/airflow/operators/subdag_operator.py",
 line 77, in __init__
    .filter(Pool.pool == self.pool)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3222, in first
    ret = list(self[0:1])
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3012, in __getitem__
    return list(res)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3324, in __iter__
    return self._execute_and_instances(context)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3349, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 988, in execute
    return meth(self, multiparams, params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/sql/elements.py",
 line 287, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1107, in _execute_clauseelement
    distilled_params,
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1248, in _execute_context
    e, statement, parameters, cursor, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1466, in _handle_dbapi_exception
    util.raise_from_cause(sqlalchemy_exception, exc_info)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/util/compat.py",
 line 399, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb, cause=cause)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/util/compat.py",
 line 153, in reraise
    raise value.with_traceback(tb)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1244, in _execute_context
    cursor, statement, parameters, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/default.py",
 line 550, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) relation 
"slot_pool" does not exist
LINE 2: FROM slot_pool
             ^

[SQL: SELECT slot_pool.id AS slot_pool_id, slot_pool.pool AS slot_pool_pool, 
slot_pool.slots AS slot_pool_slots, slot_pool.description AS 
slot_pool_description
FROM slot_pool
WHERE slot_pool.slots = %(slots_1)s AND slot_pool.pool = %(pool_1)s
 LIMIT %(param_1)s]
[parameters: {'slots_1': 1, 'pool_1': 'default_pool', 'param_1': 1}]
(Background on this error at: http://sqlalche.me/e/f405)
ERROR [airflow.models.dagbag.DagBag] Failed to import: 
/Users/ash/code/python/incubator-airflow/airflow_home/../tests/dags/test_issue_1225.py
Traceback (most recent call last):
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1244, in _execute_context
    cursor, statement, parameters, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/default.py",
 line 550, in do_execute
    cursor.execute(statement, parameters)
psycopg2.ProgrammingError: relation "slot_pool" does not exist
LINE 2: FROM slot_pool
             ^


The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/ash/code/python/incubator-airflow/airflow/models/dagbag.py", 
line 204, in process_file
    m = imp.load_source(mod_name, filepath)
  File 
"/Users/ash/.homebrew/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/imp.py",
 line 171, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 696, in _load
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File 
"/Users/ash/code/python/incubator-airflow/airflow_home/../tests/dags/test_issue_1225.py",
 line 125, in <module>
    subdag=subdag7)
  File "/Users/ash/code/python/incubator-airflow/airflow/utils/db.py", line 74, 
in wrapper
    return func(*args, **kwargs)
  File "/Users/ash/code/python/incubator-airflow/airflow/utils/decorators.py", 
line 98, in wrapper
    result = func(*args, **kwargs)
  File 
"/Users/ash/code/python/incubator-airflow/airflow/operators/subdag_operator.py",
 line 77, in __init__
    .filter(Pool.pool == self.pool)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3222, in first
    ret = list(self[0:1])
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3012, in __getitem__
    return list(res)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3324, in __iter__
    return self._execute_and_instances(context)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3349, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 988, in execute
    return meth(self, multiparams, params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/sql/elements.py",
 line 287, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1107, in _execute_clauseelement
    distilled_params,
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1248, in _execute_context
    e, statement, parameters, cursor, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1466, in _handle_dbapi_exception
    util.raise_from_cause(sqlalchemy_exception, exc_info)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/util/compat.py",
 line 399, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb, cause=cause)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/util/compat.py",
 line 153, in reraise
    raise value.with_traceback(tb)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1244, in _execute_context
    cursor, statement, parameters, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/default.py",
 line 550, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) relation 
"slot_pool" does not exist
LINE 2: FROM slot_pool
             ^

[SQL: SELECT slot_pool.id AS slot_pool_id, slot_pool.pool AS slot_pool_pool, 
slot_pool.slots AS slot_pool_slots, slot_pool.description AS 
slot_pool_description
FROM slot_pool
WHERE slot_pool.slots = %(slots_1)s AND slot_pool.pool = %(pool_1)s
 LIMIT %(param_1)s]
[parameters: {'slots_1': 1, 'pool_1': 'default_pool', 'param_1': 1}]
(Background on this error at: http://sqlalche.me/e/f405)
ERROR [airflow.models.dagbag.DagBag] Failed to import: 
/Users/ash/code/python/incubator-airflow/airflow_home/../tests/dags/test_impersonation_subdag.py
Traceback (most recent call last):
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1244, in _execute_context
    cursor, statement, parameters, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/default.py",
 line 550, in do_execute
    cursor.execute(statement, parameters)
psycopg2.ProgrammingError: relation "slot_pool" does not exist
LINE 2: FROM slot_pool
             ^


The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/ash/code/python/incubator-airflow/airflow/models/dagbag.py", 
line 204, in process_file
    m = imp.load_source(mod_name, filepath)
  File 
"/Users/ash/.homebrew/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/imp.py",
 line 171, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 696, in _load
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File 
"/Users/ash/code/python/incubator-airflow/airflow_home/../tests/dags/test_impersonation_subdag.py",
 line 64, in <module>
    dag=dag)
  File "/Users/ash/code/python/incubator-airflow/airflow/utils/db.py", line 74, 
in wrapper
    return func(*args, **kwargs)
  File "/Users/ash/code/python/incubator-airflow/airflow/utils/decorators.py", 
line 98, in wrapper
    result = func(*args, **kwargs)
  File 
"/Users/ash/code/python/incubator-airflow/airflow/operators/subdag_operator.py",
 line 77, in __init__
    .filter(Pool.pool == self.pool)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3222, in first
    ret = list(self[0:1])
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3012, in __getitem__
    return list(res)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3324, in __iter__
    return self._execute_and_instances(context)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/orm/query.py",
 line 3349, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 988, in execute
    return meth(self, multiparams, params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/sql/elements.py",
 line 287, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1107, in _execute_clauseelement
    distilled_params,
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1248, in _execute_context
    e, statement, parameters, cursor, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1466, in _handle_dbapi_exception
    util.raise_from_cause(sqlalchemy_exception, exc_info)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/util/compat.py",
 line 399, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb, cause=cause)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/util/compat.py",
 line 153, in reraise
    raise value.with_traceback(tb)
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/base.py",
 line 1244, in _execute_context
    cursor, statement, parameters, context
  File 
"/Users/ash/.virtualenvs/airflow/lib/python3.7/site-packages/sqlalchemy/engine/default.py",
 line 550, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) relation 
"slot_pool" does not exist
LINE 2: FROM slot_pool
             ^

[SQL: SELECT slot_pool.id AS slot_pool_id, slot_pool.pool AS slot_pool_pool, 
slot_pool.slots AS slot_pool_slots, slot_pool.description AS 
slot_pool_description
FROM slot_pool
WHERE slot_pool.slots = %(slots_1)s AND slot_pool.pool = %(pool_1)s
 LIMIT %(param_1)s]
[parameters: {'slots_1': 1, 'pool_1': 'default_pool', 'param_1': 1}]
(Background on this error at: http://sqlalche.me/e/f405)
INFO  [alembic.runtime.migration] Running upgrade cc1e65623dc7 -> bdaa763e6c56, 
Make xcom value column a large binary
INFO  [alembic.runtime.migration] Running upgrade bdaa763e6c56 -> 947454bf1dff, 
add ti job_id index
INFO  [alembic.runtime.migration] Running upgrade 947454bf1dff -> d2ae31099d61, 
Increase text size for MySQL (not relevant for other DBs' text types)
INFO  [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 0e2a74e0fc9f, 
Add time zone awareness
INFO  [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 33ae817a1ff4, 
kubernetes_resource_checkpointing
INFO  [alembic.runtime.migration] Running upgrade 33ae817a1ff4 -> 27c6a30d7c24, 
kubernetes_resource_checkpointing
INFO  [alembic.runtime.migration] Running upgrade 27c6a30d7c24 -> 86770d1215c0, 
add kubernetes scheduler uniqueness
INFO  [alembic.runtime.migration] Running upgrade 86770d1215c0, 0e2a74e0fc9f -> 
05f30312d566, merge heads
INFO  [alembic.runtime.migration] Running upgrade 05f30312d566 -> f23433877c24, 
fix mysql not null constraint
INFO  [alembic.runtime.migration] Running upgrade f23433877c24 -> 856955da8476, 
fix sqlite foreign key
INFO  [alembic.runtime.migration] Running upgrade 856955da8476 -> 9635ae0956e7, 
index-faskfail
INFO  [alembic.runtime.migration] Running upgrade 9635ae0956e7 -> dd25f486b8ea, 
add idx_log_dag
INFO  [alembic.runtime.migration] Running upgrade dd25f486b8ea -> bf00311e1990, 
add index to taskinstance
INFO  [alembic.runtime.migration] Running upgrade 9635ae0956e7 -> 0a2a5b66e19d, 
add task_reschedule table
INFO  [alembic.runtime.migration] Running upgrade 0a2a5b66e19d, bf00311e1990 -> 
03bc53e68815, merge_heads_2
INFO  [alembic.runtime.migration] Running upgrade 03bc53e68815 -> 41f5f12752f8, 
add superuser field
INFO  [alembic.runtime.migration] Running upgrade 41f5f12752f8 -> c8ffec048a3b, 
add fields to dag
INFO  [alembic.runtime.migration] Running upgrade c8ffec048a3b -> dd4ecb8fbee3, 
Add schedule interval to dag
INFO  [alembic.runtime.migration] Running upgrade dd4ecb8fbee3 -> 939bb1e647c8, 
task reschedule fk on cascade delete
INFO  [alembic.runtime.migration] Running upgrade 939bb1e647c8 -> 6e96a59344a4, 
Make TaskInstance.pool not nullable
INFO  [alembic.runtime.migration] Running upgrade 6e96a59344a4 -> 74effc47d867, 
change datetime to datetime2(6) on MSSQL tables
INFO  [alembic.runtime.migration] Running upgrade 939bb1e647c8 -> 004c1210f153, 
increase queue name size limit
INFO  [alembic.runtime.migration] Running upgrade c8ffec048a3b -> a56c9515abdc, 
Remove dag_stat table
env PYTHONPATH=../ AIRFLOW__CORE__LOGGING_LEVEL=DEBUG AIRFLOW__HOME=$PWD       
3.17s user 0.61s system 58% cpu 6.527 total

$ echo $?
0



> On 27 Oct 2019, at 16:38, Ash Berlin-Taylor <a...@apache.org> wrote:
> 
> Does it fail and terminate the resetdb/migration, or just print an error to 
> the console? If it's the latter, it's been doing that for ages. Yes, it 
> should be fixed, but it has been doing that for all of 1.10, I think.
> 
> When I try it with the dag folder you suggest, I see further migrations 
> after the error. Don't forget that DagBag loading handles errors; further 
> migrations run after the error is printed, and `echo $?` reports success.
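
The point about `echo $?` can be demonstrated in isolation (an illustrative sketch, not Airflow code — the function name is made up):

```shell
# Illustrative only: a process can print ERROR lines to stderr and still
# exit 0, which is why `echo $?` shows success even though DagBag import
# errors were printed during the resetdb run above.
print_error_but_succeed() {
  echo "ERROR [DagBag] Failed to import: some_dag.py" >&2
  return 0
}

print_error_but_succeed
echo "exit status: $?"   # prints: exit status: 0
```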
> 
> -ash
> 
>> On 27 Oct 2019, at 16:33, Jarek Potiuk <jarek.pot...@polidea.com> wrote:
>> 
>> I am also reconsidering my +1, likewise because of migrations but for a
>> different reason.
>> 
>> During my tests of cherry-picking the latest kind changes to v1-10-test, I
>> have just found another potential problem with migrations. Looking at what
>> Fokko reported, we might want to consider reviewing and fixing the
>> migration scripts before 1.10.6.
>> 
>> The scenario is a bit involved, but it might cause problems for someone
>> resetting the DB in 1.10.6 with DAGs already in the "dags" folder. I am
>> not sure when it was introduced (a few components combine to cause the
>> problem), so it might already be in one of the earlier releases.
>> 
>> You can reproduce it when you run `airflow resetdb` in the current 1.10.6
>> (I tried with Python 2.7 + Postgres) and you have
>> AIRFLOW__CORE__DAGS_FOLDER="${AIRFLOW_SOURCES}/tests/dags" set (which is
>> set in my version of Breeze).
>> 
>> The resetdb operation fails in the middle with "cannot import
>> tests/dags/test_clear_subdag.py". I investigated it a bit and found that
>> this is a problem with one of the migration scripts plus the subdag
>> operator relying on the slot_pool relation being present.
>> 
>> The problem is in *cc1e65623dc7_add_max_tries_column_to_task_instance.py*.
>> The script has this piece of code:
>> 
>> if 'task_instance' in tables:
>>   # Get current session
>>   sessionmaker = sa.orm.sessionmaker()
>>   session = sessionmaker(bind=connection)
>>   dagbag = DagBag(settings.DAGS_FOLDER)
>> 
>> This means that it loads and parses all the DAG files during the migration.
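
The import-time hazard described here can be sketched in miniature (an illustrative standalone example, not Airflow code): executing a DAG file runs its top-level statements, so any DB query an operator constructor makes runs immediately, against whatever schema exists at that point in the migration chain.

```python
# Minimal illustration (not Airflow code): loading a Python file the way
# DagBag does executes its top-level statements. If one of those
# statements constructs an operator that queries the database, the query
# runs right here, mid-migration.
import importlib.util
import os
import tempfile
import textwrap

dag_source = textwrap.dedent("""
    # Top-level statements run at import/exec time.
    executed = True
""")

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(dag_source)
    path = f.name

spec = importlib.util.spec_from_file_location("example_dag", path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)  # the DAG file's code executes here
os.unlink(path)

print(module.executed)  # True
```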
>> 
>> Then there is this piece of code in *subdag_operator*'s __init__():
>> 
>> # validate that subdag operator and subdag tasks don't have a
>> # pool conflict
>> if self.pool and self.pool != Pool.DEFAULT_POOL_NAME:
>>   conflicts = [t for t in subdag.tasks if t.pool == self.pool]
>>   if conflicts:
>>       # only query for pool conflicts if one may exist
>>       pool = (
>>           session
>>           .query(Pool)
>>           .filter(Pool.slots == 1)
>>           .filter(Pool.pool == self.pool)
>>           .first()
>>       )
>> 
>> This means that if self.pool is set, the Pool query will be executed.
>> Unfortunately, at this point the slot_pool relation has not been created
>> yet, because it is created in one of the later migrations, so the query
>> fails with:
>> 
>> 
>> 
>> *psycopg2.ProgrammingError: relation "slot_pool" does not exist*
>> *LINE 2: FROM slot_pool*
>> 
>> Parsing the test DAG now fails because the pool is set to 'default_pool'
>> (I guess it used to be None in one of the earlier versions, which is why
>> this was not caught before).
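
One way such a query could defend itself, sketched with plain SQLAlchemy (an illustrative approach with made-up names, not what subdag_operator or the migration currently does), is to check whether the table exists before querying it:

```python
# Illustrative sketch (not the actual Airflow code): guard a query on a
# table that may not exist yet at a given point in a migration chain.
import sqlalchemy as sa

metadata = sa.MetaData()
slot_pool = sa.Table(
    "slot_pool",
    metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("pool", sa.String(50)),
)

# An in-memory SQLite DB stands in for the half-migrated Postgres schema:
# slot_pool exists, task_instance does not.
engine = sa.create_engine("sqlite://")
metadata.create_all(engine)

def table_exists(bind, name):
    """Return True if the named table exists in the bound database."""
    return name in sa.inspect(bind).get_table_names()

print(table_exists(engine, "slot_pool"))      # True
print(table_exists(engine, "task_instance"))  # False
```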
>> 
>> I am afraid that if someone has a DAG containing a SubDag operator, they
>> will have exactly this problem when running `airflow resetdb`.
>> 
>> I am not 100% sure whether this is a blocker, but it looks very close to
>> being one. We could tell users to always remove all the files from
>> Airflow's dag folder before running resetdb, but I am afraid that is not
>> really nice.
>> 
>> J.
>> 
>> 
>> On Sun, Oct 27, 2019 at 4:48 PM Kaxil Naik <kaxiln...@gmail.com> wrote:
>> 
>>> As 2.0 is not released, if the migrations are broken, we can still fix
>>> them on master itself.
>>> 
>>> 
>>> 
>>> On Sun, Oct 27, 2019, 15:29 Driesprong, Fokko <fo...@driesprong.frl>
>>> wrote:
>>> 
>>>> The problem is that if you upgrade from the 1.10 branch to 2.0, the
>>>> migrations don't work. Probably the migrations in 1.10 and 2.0 have
>>>> diverged somewhere.
>>>> 
>>>> The following can be used to reproduce this:
>>>> git pull upstream master # ensure we have latest
>>>> git checkout v1-10-stable # 1.10.6rc2
>>>> rm ~/airflow/airflow.db
>>>> airflow initdb
>>>> git checkout master
>>>> airflow db upgrade
>>>> 
>>>> 
>>>> 
>>>> Op zo 27 okt. 2019 om 15:53 schreef Ash Berlin-Taylor <a...@apache.org>:
>>>> 
>>>>> That PR you mentioned isn't yet in a release, so can we not just edit
>>>> that
>>>>> migration in place?
>>>>> 
>>>>> I'm not quite following (I didn't get much sleep), but I don't see how
>>>>> broken code on master affects this release?
>>>>> 
>>>>> -a
>>>>> 
>>>>>> On 27 Oct 2019, at 14:28, Driesprong, Fokko <fo...@driesprong.frl>
>>>>> wrote:
>>>>>> 
>>>>>> I'm having second thoughts on my +1 vote.
>>>>>> 
>>>>>> It turns out that the database migrations are broken from 1.10.6 to
>>>>> 2.0.0:
>>>>>> 
>>>>> 
>>>> 
>>> https://github.com/apache/airflow/pull/6370/files/c9b9312a60f891475d3072584171c2af56246829#r339287344
>>>>>> 
>>>>>> So we either need to release a 1.10.7 and force users to migrate to
>>>>>> that version first before going to 2.0.0, or it won't work. I've
>>>>>> opened up a PR to fix this for 2.0.0:
>>>>>> https://github.com/apache/airflow/pull/6442
>>>>>> 
>>>>>> Cheers, Fokko
>>>>>> 
>>>>>> 
>>>>>> Op za 26 okt. 2019 om 09:23 schreef Jarek Potiuk <
>>>>> jarek.pot...@polidea.com>:
>>>>>> 
>>>>>>> +1 (binding)
>>>>>>> 
>>>>>>> Tested on Python 3.7/Python 2.7 using local installation + double
>>>>> checked
>>>>>>> sources with RAT licence tool (All good this time!).
>>>>>>> 
>>>>>>> Regarding why AIRFLOW-5746
>>>>>>> (https://github.com/apache/airflow/pull/6434) is being reverted (or
>>>>>>> rather improved on): I don't think it's a blocker for this release.
>>>>>>> 
>>>>>>> It is just a test change: the tests, as they were written, broke some
>>>>>>> development environments (airflow resetdb was failing when PYTHONPATH
>>>>>>> was not specially crafted). The current test does not really test
>>>>>>> what it should (passing PYTHONPATH to impersonated tasks), but the
>>>>>>> functionality (impersonation + path) has not actually changed and it
>>>>>>> works. The test worked fine before the change was added; it's just
>>>>>>> the imports in the tests that have changed.
>>>>>>> 
>>>>>>> Kamil (thanks!) has already submitted a fix that solves it
>>>>>>> permanently (https://github.com/apache/airflow/pull/6436/) - I will
>>>>>>> review and merge it soon. But it's not a blocker IMHO.
>>>>>>> 
>>>>>>> J.
>>>>>>> 
>>>>>>> On Sat, Oct 26, 2019 at 2:10 AM Kaxil Naik <kaxiln...@gmail.com>
>>>> wrote:
>>>>>>> 
>>>>>>>> +1 (binding)
>>>>>>>> 
>>>>>>>> On Sat, Oct 26, 2019 at 12:05 AM Driesprong, Fokko
>>>>> <fo...@driesprong.frl
>>>>>>>> 
>>>>>>>> wrote:
>>>>>>>> 
>>>>>>>>> +1 binding from my side
>>>>>>>>> 
>>>>>>>>> Ran an example DAGs with Docker using Python 3.7.
>>>>>>>>> 
>>>>>>>>> We might need to check why AIRFLOW-5746 is being reverted:
>>>>>>>>> https://github.com/apache/airflow/pull/6434
>>>>>>>>> 
>>>>>>>>> If there is another RC, I'd like to request to cherry-pick
>>>>>>>>> https://github.com/apache/airflow/pull/6370 onto the 1.10 branch.
>>>>>>>>> 
>>>>>>>>> Cheers, Fokko
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> Op vr 25 okt. 2019 om 23:04 schreef Ash Berlin-Taylor <
>>>> a...@apache.org
>>>>>>>> :
>>>>>>>>> 
>>>>>>>>>> Hey all,
>>>>>>>>>> 
>>>>>>>>>> I have cut Airflow 1.10.6 RC2. This email is calling a vote on
>>> the
>>>>>>>>>> release, which will last for 72 hours, until Monday, October 28th
>>>>>>>>>> at 21:05 UTC.
>>>>>>>>>> 
>>>>>>>>>> Consider this my (binding) +1.
>>>>>>>>>> 
>>>>>>>>>> Airflow 1.10.6 RC2 is available at:
>>>>>>>>>> <https://dist.apache.org/repos/dist/dev/airflow/1.10.6rc2/>
>>>>>>>>>> 
>>>>>>>>>> *apache-airflow-1.10.6rc2-source.tar.gz* is a source release that
>>>>>>> comes
>>>>>>>>>> with INSTALL instructions.
>>>>>>>>>> *apache-airflow-1.10.6rc2-bin.tar.gz* is the binary Python
>>> "sdist"
>>>>>>>>> release.
>>>>>>>>>> *apache_airflow-1.10.6rc2-py2.py3-none-any.whl* is the binary
>>>> Python
>>>>>>>>>> "wheel" release.
>>>>>>>>>> 
>>>>>>>>>> Public keys are available at: <
>>>>>>>>>> https://dist.apache.org/repos/dist/release/airflow/KEYS>
>>>>>>>>>> 
>>>>>>>>>> As per normal, the rc2 is available for testing from PyPI.
>>>>>>>>>> 
>>>>>>>>>> Only votes from PMC members are binding, but members of the
>>>> community
>>>>>>>> are
>>>>>>>>>> encouraged to test the release and vote with "(non-binding)".
>>>>>>>>>> 
>>>>>>>>>> Please note that the version number excludes the `rcX` string, so
>>>>>>> it's
>>>>>>>>> now
>>>>>>>>>> simply 1.10.6. This will allow us to rename the artifact without
>>>>>>>>> modifying
>>>>>>>>>> the artifact checksums when we actually release.
>>>>>>>>>> 
>>>>>>>>>> The changes since RC1 fix licence issues and ensure tests are
>>>>>>>>>> running on Py2 (they weren't, but luckily the only py3 bits that
>>>>>>>>>> crept in were in the test files).
>>>>>>>>>> 
>>>>>>>>>> Changelog since 1.10.6rc1:
>>>>>>>>>> 
>>>>>>>>>> * 73bf71835 [AIRFLOW-XXX] Update date in changelog [Ash
>>>>>>> Berlin-Taylor]
>>>>>>>>>> * 143b43151 [AIRFLOW-5750] Licence check is done also for
>>>>>>>> non-executable
>>>>>>>>>> .sh (#6425) [Jarek Potiuk]
>>>>>>>>>> * 544f2b336 [AIRFLOW-5754] Improved RAT checking (#6429) [Jarek
>>>>>>> Potiuk]
>>>>>>>>>> * 7904669ca [AIRFLOW-5755] Fixed most problems with py27 [Jarek
>>>>>>> Potiuk]
>>>>>>>>>> * d601752c4 [AIRFLOW-5748] Remove python auto-detection (#6423)
>>>>>>> [Jarek
>>>>>>>>>> Potiuk]
>>>>>>>>>> * 71e20417f [AIRFLOW-5746] Fix problems with static checks
>>> (#6420)
>>>>>>>> [Jarek
>>>>>>>>>> Potiuk]
>>>>>>>>>> * 7a6adad60 [AIRFLOW-5746] move FakeDateTime into the only place
>>> it
>>>>>>> is
>>>>>>>>>> used (#6416) [Michael R. Crusoe]
>>>>>>>>>> * e30fb85ca [AIRFLOW-5745] Breeze complete has now licence
>>> (#6415)
>>>>>>>> [Jarek
>>>>>>>>>> Potiuk]
>>>>>>>>>> 
>>>>>>>>>> Files changes since rc1:
>>>>>>>>>> 
>>>>>>>>>> airflow ❯ git diff --stat 1.10.6rc1...1.10.6rc2
>>>>>>>>>> .pre-commit-config.yaml                      |  1 -
>>>>>>>>>> .rat-excludes                                |  1 +
>>>>>>>>>> .travis.yml                                  | 47
>>>>>>>>>> +++++++++++++++++++++++++++++++++++++++--------
>>>>>>>>>> CHANGELOG.txt                                |  2 +-
>>>>>>>>>> Dockerfile-checklicence                      |  2 +-
>>>>>>>>>> airflow/models/dag.py                        |  5 ++++-
>>>>>>>>>> breeze                                       |  1 +
>>>>>>>>>> breeze-complete                              | 17
>>> +++++++++++++++++
>>>>>>>>>> common/_autodetect_variables.sh              | 49
>>>>>>>>>> +++++++++++++------------------------------------
>>>>>>>>>> files/x                                      |  0
>>>>>>>>>> files/y                                      |  0
>>>>>>>>>> scripts/ci/_utils.sh                         | 16
>>> ++++++++++++++--
>>>>>>>>>> scripts/ci/ci_check_license.sh               |  1 +
>>>>>>>>>> scripts/ci/ci_run_airflow_testing.sh         |  2 +-
>>>>>>>>>> scripts/ci/in_container/run_check_licence.sh |  2 +-
>>>>>>>>>> tests/contrib/hooks/test_ssh_hook.py         |  3 ++-
>>>>>>>>>> tests/dags/test_impersonation_custom.py      | 12 +++++++++++-
>>>>>>>>>> tests/models/test_baseoperator.py            |  9 ++++-----
>>>>>>>>>> tests/models/test_dag.py                     |  4 ++--
>>>>>>>>>> tests/test_sentry.py                         |  2 +-
>>>>>>>>>> tests/test_utils/fake_datetime.py            | 29
>>>>>>>>>> -----------------------------
>>>>>>>>>> tests/test_utils/system_tests_class.py       |  6 +++++-
>>>>>>>>>> tests/www/test_views.py                      |  6 +++---
>>>>>>>>>> 23 files changed, 122 insertions(+), 95 deletions(-)
>>>>>>>>>> 
>>>>>>>>>> 
>>>>>>>>> 
>>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> --
>>>>>>> 
>>>>>>> Jarek Potiuk
>>>>>>> Polidea <https://www.polidea.com/> | Principal Software Engineer
>>>>>>> 
>>>>>>> M: +48 660 796 129 <+48660796129>
>>>>>>> 
>>>>> 
>>>>> 
>>>> 
>>> 
>> 
>> 
>> -- 
>> 
>> Jarek Potiuk
>> Polidea <https://www.polidea.com/> | Principal Software Engineer
>> 
>> M: +48 660 796 129 <+48660796129>
> 
