Hi!

On 2019-05-16 at 13:00 UTC there will be a maintenance operation in one of the
Wikimedia Foundation datacenter racks that affects two of our servers running
virtual machines [0]. There is a risk that this maintenance operation may result
in power loss to the servers, affecting the virtual machines running on them.
However, there is no way to know in advance whether there will be any outage at
all.

If you are an admin of any of the VMs in the list below and you want the VM to
be reallocated to another server before the operation, please get in touch with
us as soon as possible. Keep in mind that, right now, reallocating a VM to
another server means briefly shutting it down.

Here is a list of affected virtual machines:

cloudvirt1028.eqiad.wmnet:
    af-puppetdb01.automation-framework.eqiad.wmflabs
    bastion-eqiad1-02.bastion.eqiad.wmflabs
    fridolin.catgraph.eqiad.wmflabs
    cloud-puppetmaster-02.cloudinfra.eqiad.wmflabs
    cloudstore-dev-01.cloudstore.eqiad.wmflabs
    commtech-nsfw.commtech.eqiad.wmflabs
    clm-test-01.community-labs-monitoring.eqiad.wmflabs
    cyberbot-exec-iabot-01.cyberbot.eqiad.wmflabs
    deployment-db05.deployment-prep.eqiad.wmflabs
    deployment-memc05.deployment-prep.eqiad.wmflabs
    deployment-sca01.deployment-prep.eqiad.wmflabs
    deployment-pdfrender02.deployment-prep.eqiad.wmflabs
    ign.ign2commons.eqiad.wmflabs
    integration-slave-docker-1050.integration.eqiad.wmflabs
    integration-castor03.integration.eqiad.wmflabs
    api.openocr.eqiad.wmflabs
    osmit-umap.osmit.eqiad.wmflabs
    builder-envoy.packaging.eqiad.wmflabs
    jmm-buster.puppet.eqiad.wmflabs
    a11y.reading-web-staging.eqiad.wmflabs
    adhoc-utils01.security-tools.eqiad.wmflabs
    util-abogott-stretch.testlabs.eqiad.wmflabs
    canary1028-01.testlabs.eqiad.wmflabs
    stretch.thumbor.eqiad.wmflabs
    tools-worker-1023.tools.eqiad.wmflabs
    tools-proxy-04.tools.eqiad.wmflabs
    tools-docker-builder-06.tools.eqiad.wmflabs
    tools-sgewebgrid-generic-0904.tools.eqiad.wmflabs
    tools-sgeexec-0942.tools.eqiad.wmflabs
    tools-sgeexec-0941.tools.eqiad.wmflabs
    tools-sgeexec-0940.tools.eqiad.wmflabs
    tools-sgeexec-0939.tools.eqiad.wmflabs
    tools-sgeexec-0937.tools.eqiad.wmflabs
    tools-sgeexec-0929.tools.eqiad.wmflabs
    tools-sgeexec-0921.tools.eqiad.wmflabs
    tools-sgeexec-0920.tools.eqiad.wmflabs
    tools-sgeexec-0911.tools.eqiad.wmflabs
    tools-sgeexec-0909.tools.eqiad.wmflabs
    toolsbeta-proxy-01.toolsbeta.eqiad.wmflabs
    vconverter-instance.videowiki.eqiad.wmflabs
    perfbot.webperf.eqiad.wmflabs
    wdhqs-1.wikidata-history-query-service.eqiad.wmflabs

cloudvirt1014.eqiad.wmnet:
    commonsarchive-prod.commonsarchive.eqiad.wmflabs
    deployment-imagescaler03.deployment-prep.eqiad.wmflabs
    dumps-5.dumps.eqiad.wmflabs
    dumps-4.dumps.eqiad.wmflabs
    incubator-mw.incubator.eqiad.wmflabs
    webperformance.integration.eqiad.wmflabs
    saucelabs-01.integration.eqiad.wmflabs
    integration-puppetmaster01.integration.eqiad.wmflabs
    maps-puppetmaster.maps.eqiad.wmflabs
    maps-wma.maps.eqiad.wmflabs
    mwoffliner3.mwoffliner.eqiad.wmflabs
    mwoffliner1.mwoffliner.eqiad.wmflabs
    phlogiston-5.phlogiston.eqiad.wmflabs
    discovery-testing-01.shiny-r.eqiad.wmflabs
    snuggle-enwiki-01.snuggle.eqiad.wmflabs
    canary-1014-01.testlabs.eqiad.wmflabs
    tools-sgeexec-0901.tools.eqiad.wmflabs
    wdqs-test.wikidata-query.eqiad.wmflabs


Toolforge won't be affected by this operation.
You can read more details about the datacenter operation itself in Phabricator
[1].

Sorry for the short notice.

Regards,

[0] Cloud Services: reallocate workload from rack B5-eqiad
    https://phabricator.wikimedia.org/T223148
[1] Install new PDUs into b5-eqiad
    https://phabricator.wikimedia.org/T223126
-- 
Arturo Borrero Gonzalez
Operations Engineer / Wikimedia Cloud Services
Wikimedia Foundation
