[Yahoo-eng-team] [Bug 1988297] Re: Unique constraint on external_id still created for table access_rule

2024-04-03 Thread OpenStack Infra
Reviewed:  https://review.opendev.org/c/openstack/keystone/+/885463
Committed: https://opendev.org/openstack/keystone/commit/90dcff07c03ee60227b01f47d67fe9e5b1629593
Submitter: "Zuul (22348)"
Branch: master

commit 90dcff07c03ee60227b01f47d67fe9e5b1629593
Author: Christian Rohmann 
Date:   Wed Jun 7 14:49:35 2023 +0200

sql: Fixup for invalid unique constraint on external_id in access_rule table

A large batch of invalid constraints was dropped with [1]. One of them was the
constraint on `external_id` in the access_rule table.

While that drop made it into an Alembic revision with [2], the constraint still
exists in the schema, causing a new Alembic autogeneration run to add it back
as a revision.

[1] https://review.opendev.org/c/openstack/keystone/+/851845
[2] https://opendev.org/openstack/keystone/commit/7d169870fe418b9aa5765dc2f413ecdf9c8f1d48#diff-26484e3f6683ce7557e17b67220003784ff84fbe

Closes-Bug: #1988297
Change-Id: I66626ba8771ef2aa8b3580fd3f5d15fd4b58ab48
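
For illustration only (this is not the keystone revision referenced above), dropping
such a leftover unique constraint in a follow-up Alembic migration could look roughly
like the sketch below; the revision identifiers are placeholders, and only the table
and constraint names are taken from this bug.

```
# Hypothetical Alembic revision sketch, not the actual keystone change:
# it drops the redundant unique constraint so that autogenerate stops
# re-adding it as a new revision.
from alembic import op

revision = 'xxxxxxxxxxxx'        # placeholder revision id
down_revision = 'yyyyyyyyyyyy'   # placeholder parent revision


def upgrade():
    # The unique constraint on external_id duplicates the existing index,
    # so removing it also silences the duplicate-index warning.
    op.drop_constraint('access_rule_external_id_key', 'access_rule',
                       type_='unique')


def downgrade():
    op.create_unique_constraint('access_rule_external_id_key', 'access_rule',
                                ['external_id'])
```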


** Changed in: keystone
   Status: In Progress => Fix Released

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Identity (keystone).
https://bugs.launchpad.net/bugs/1988297

Title:
  Unique constraint on external_id still created for table access_rule

Status in OpenStack Identity (keystone):
  Fix Released

Bug description:
  Currently a unique constraint and an additional index are configured on the same column.
  This is why SQLAlchemy logs a warning during a database migration, displaying the following information:

  ```
  /usr/lib/python3/dist-packages/pymysql/cursors.py:170: Warning: (1831, 'Duplicate index `uniq_instances0uuid`. This is deprecated and will be disallowed in a future release')
  result = self._query(query)
  ```
  (This example is actually taken from nova output, but it looks just the same for Keystone.
  The same issue also exists in the Nova schemas; see bug https://bugs.launchpad.net/nova/+bug/1641185.)

  From my understanding of the MySQL documentation (see [1] [2]) and the
  PostgreSQL documentation (see [3] [4]), a unique constraint automatically
  creates an index for the column(s) it covers, so there should be no need to
  create an additional index for the same column:

  ```
  Table: access_rule (https://opendev.org/openstack/keystone/src/commit/7c2d0f589c8daf5c65a80ed20d1e7fbfcc282312/keystone/common/sql/migrations/versions/27e647c0fad4_initial_version.py#L120)

  Column: external_id
  Indexes:
  Unique Constraint: access_rule_external_id_key
  Index: external_id
  ```
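
  To make the redundancy above concrete, here is a minimal SQLAlchemy sketch (not
  the actual keystone model; the id column and the String(64) type are assumptions):
  the UniqueConstraint already backs lookups on external_id with a unique index, so
  the extra Index only produces the duplicate-index warning shown earlier.

  ```
  import sqlalchemy as sa

  metadata = sa.MetaData()

  # Redundant: a unique constraint plus an explicit index on the same column.
  access_rule = sa.Table(
      'access_rule', metadata,
      sa.Column('id', sa.Integer, primary_key=True),
      sa.Column('external_id', sa.String(64)),
      sa.UniqueConstraint('external_id', name='access_rule_external_id_key'),
      sa.Index('external_id', 'external_id'),
  )
  # Dropping either the Index or the UniqueConstraint avoids the duplicate index.
  ```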


  [1] https://dev.mysql.com/doc/refman/8.0/en/create-index.html#create-index-unique
  [2] https://dev.mysql.com/doc/refman/8.0/en/mysql-indexes.html
  [3] https://www.postgresql.org/docs/current/ddl-constraints.html#DDL-CONSTRAINTS-UNIQUE-CONSTRAINTS
  [4] https://www.postgresql.org/docs/current/indexes-types.html

To manage notifications about this bug go to:
https://bugs.launchpad.net/keystone/+bug/1988297/+subscriptions


-- 
Mailing list: https://launchpad.net/~yahoo-eng-team
Post to : yahoo-eng-team@lists.launchpad.net
Unsubscribe : https://launchpad.net/~yahoo-eng-team
More help   : https://help.launchpad.net/ListHelp


[Yahoo-eng-team] [Bug 2059962] Re: Unable to Live Migrate Instance. Horizon doesn't show error, logs do

2024-04-03 Thread Dave West
This is not a bug; a solution was found.

** Changed in: nova
   Status: New => Invalid

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/2059962

Title:
  Unable to Live Migrate Instance.  Horizon doesn't show error, logs do

Status in OpenStack Compute (nova):
  Invalid

Bug description:
  When trying to live migrate an instance between hosts, I get the error
  below. I can migrate an instance that is shut down.

  Apr  1 11:50:56 osp-compute-c02-01 nova-compute[27905]: 2024-04-01 
11:50:56.665 27905 DEBUG nova.compute.manager [None 
req-e8e1cbc3-257a-48e8-9181-b940fa152cac dea9b2ecc4644988aec805e735f03de3 
18321e0d96a74a4ab1ec27394166fb66 - - default default] live_migration data is 
LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=63488,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpe_9d4avw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=10.136.149.5,image_type='default',instance_relative_path='b44705f4-6813-42ad-a4a8-a64255e2a6b7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(7e5bb92b-8d84-4320-b379-ee7b14fed53d),old_vol_attachment_ids={375dff72-24d4-475c-a172-33e356765410='1432f021-67a6-4b51-9988-fcad5a6e4684'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr='10.136.149.5',vifs=[VIFMigrateData],wait_for_vif_plugged=True)
 _do_live_migration 
/openstack/venvs/nova-28.1.1.dev4/lib64/python3.9/site-packages/nova/compute/manager.py:8988
  Apr  1 11:50:56 osp-compute-c02-01 nova-compute[27905]: 2024-04-01 
11:50:56.668 27905 DEBUG nova.objects.instance [None 
req-e8e1cbc3-257a-48e8-9181-b940fa152cac dea9b2ecc4644988aec805e735f03de3 
18321e0d96a74a4ab1ec27394166fb66 - - default default] Lazy-loading 
'migration_context' on Instance uuid b44705f4-6813-42ad-a4a8-a64255e2a6b7 
obj_load_attr 
/openstack/venvs/nova-28.1.1.dev4/lib64/python3.9/site-packages/nova/objects/instance.py:1152
  Apr  1 11:50:56 osp-compute-c02-01 nova-compute[27905]: 2024-04-01 
11:50:56.668 27905 DEBUG nova.virt.libvirt.driver [None 
req-e8e1cbc3-257a-48e8-9181-b940fa152cac dea9b2ecc4644988aec805e735f03de3 
18321e0d96a74a4ab1ec27394166fb66 - - default default] [instance: 
b44705f4-6813-42ad-a4a8-a64255e2a6b7] Starting monitoring of live migration 
_live_migration 
/openstack/venvs/nova-28.1.1.dev4/lib64/python3.9/site-packages/nova/virt/libvirt/driver.py:10641
  Apr  1 11:50:56 osp-compute-c02-01 nova-compute[27905]: 2024-04-01 
11:50:56.671 27905 DEBUG nova.virt.libvirt.driver [None 
req-e8e1cbc3-257a-48e8-9181-b940fa152cac dea9b2ecc4644988aec805e735f03de3 
18321e0d96a74a4ab1ec27394166fb66 - - default default] [instance: 
b44705f4-6813-42ad-a4a8-a64255e2a6b7] Operation thread is still running 
_live_migration_monitor 
/openstack/venvs/nova-28.1.1.dev4/lib64/python3.9/site-packages/nova/virt/libvirt/driver.py:10442
  Apr  1 11:50:56 osp-compute-c02-01 nova-compute[27905]: 2024-04-01 
11:50:56.671 27905 DEBUG nova.virt.libvirt.driver [None 
req-e8e1cbc3-257a-48e8-9181-b940fa152cac dea9b2ecc4644988aec805e735f03de3 
18321e0d96a74a4ab1ec27394166fb66 - - default default] [instance: 
b44705f4-6813-42ad-a4a8-a64255e2a6b7] Migration not running yet 
_live_migration_monitor 
/openstack/venvs/nova-28.1.1.dev4/lib64/python3.9/site-packages/nova/virt/libvirt/driver.py:10451
  Apr  1 11:50:56 osp-compute-c02-01 nova-compute[27905]: 2024-04-01 
11:50:56.681 27905 DEBUG nova.virt.libvirt.migration [None 
req-e8e1cbc3-257a-48e8-9181-b940fa152cac dea9b2ecc4644988aec805e735f03de3 
18321e0d96a74a4ab1ec27394166fb66 - - default default] Find same serial number: 
pos=0, serial=375dff72-24d4-475c-a172-33e356765410 _update_volume_xml 
/openstack/venvs/nova-28.1.1.dev4/lib64/python3.9/site-packages/nova/virt/libvirt/migration.py:242
  Apr  1 11:50:56 osp-compute-c02-01 nova-compute[27905]: 2024-04-01 
11:50:56.682 27905 DEBUG nova.virt.libvirt.vif [None 
req-e8e1cbc3-257a-48e8-9181-b940fa152cac dea9b2ecc4644988aec805e735f03de3 
18321e0d96a74a4ab1ec27394166fb66 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,compute_id=15,config_drive='',created_at=2024-04-01T16:44:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='test',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(3),hidden=False,host='osp-compute-c02-01.tsa402.service-now.com',hostname='test',id=1,image_ref='',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None

[Yahoo-eng-team] [Bug 2060163] [NEW] [ovn] race condition with add/remove router interface

2024-04-03 Thread Mohammed Naser
Public bug reported:

We're running into an issue in our CI with Atmosphere where we
frequently see failures when an interface is removed from a router;
the traceback is the following:

==
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource [None 
req-e5d08cdd-28e6-4231-a50c-7eafc1b8f942 70fc3b55af8c4386b80207dad11db5da 
dcec54844db44eedbd9667951a5ceb6b - - - -] remove_router_interface failed: No 
details.: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find 
Logical_Router_Port with name=lrp-7e0debbb-893c-420a-8569-d8fb98e6a16e
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource Traceback (most recent 
call last):
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/neutron/api/v2/resource.py", 
line 98, in resource
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource result = 
method(request=request, **args)
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/neutron_lib/db/api.py", line 
140, in wrapped
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource with 
excutils.save_and_reraise_exception():
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_utils/excutils.py", line 
227, in __exit__
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource 
self.force_reraise()
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_utils/excutils.py", line 
200, in force_reraise
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource raise self.value
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/neutron_lib/db/api.py", line 
138, in wrapped
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource return f(*args, 
**kwargs)
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_db/api.py", line 144, in 
wrapper
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource with 
excutils.save_and_reraise_exception() as ectxt:
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_utils/excutils.py", line 
227, in __exit__
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource 
self.force_reraise()
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_utils/excutils.py", line 
200, in force_reraise
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource raise self.value
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_db/api.py", line 142, in 
wrapper
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource return f(*args, 
**kwargs)
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/neutron_lib/db/api.py", line 
186, in wrapped
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource with 
excutils.save_and_reraise_exception():
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_utils/excutils.py", line 
227, in __exit__
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource 
self.force_reraise()
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_utils/excutils.py", line 
200, in force_reraise
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource raise self.value
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/neutron_lib/db/api.py", line 
184, in wrapped
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource return 
f(*dup_args, **dup_kwargs)
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/neutron/api/v2/base.py", line 
253, in _handle_action
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource ret_value = 
getattr(self._plugin, name)(*arg_list, **kwargs)
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/neutron/services/ovn_l3/plugin.py",
 line 260, in remove_router_interface
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource with 
excutils.save_and_reraise_exception():
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_utils/excutils.py", line 
227, in __exit__
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource 
self.force_reraise()
2024-04-03 21:13:09.804 10 ERROR neutron.api.v2.resource   File 
"/var/lib/openstack/lib/python3.10/site-packages/oslo_utils/excutils.py", line 
200, in force_reraise
2024-04-03 21:13:09.804 10 ERROR neutron.api.v