    Pau Garcia Quiles
    @paususe
    run supportconfig and attach the output
    supportconfig is in the supportutils package. If you are using one of my VMs, it's already installed.
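(For reference, a typical invocation on a SLES/openSUSE system looks like the sketch below. The archive name prefix has varied across supportconfig versions — `nts_` historically, `scc_` more recently — so the `ls` pattern is a best guess, not a guarantee.)

```shell
# Install the supportutils package if it is not present (SLES/openSUSE).
sudo zypper install -y supportutils

# Run supportconfig; it collects system information into a compressed
# archive under /var/log (exact file name depends on the version).
sudo supportconfig

# List the generated archive(s) to attach to the issue.
ls -lh /var/log/scc_* /var/log/nts_* 2>/dev/null
```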
    Ethan Bonick
    @EthanB11
    It was installed, running now
    I built the VM myself; I didn't know there were prebuilt VMs.
    I just built this up last week. We moved to SLES 15 a few months back and that's when I realized SLES 15 and spacewalk didn't work together anymore. It looks like you guys fixed the reposync issue and I opened a ticket with spacewalk to try and get the fix you guys did implemented. But it was never assigned so I had to find another solution and ended up finding uyuni last week.
    Pau Garcia Quiles
    @paususe
    that's great to hear
    we did a lot more stuff, in addition to fixing reposync ;-)
    Ethan Bonick
    @EthanB11
    opening ticket, trying to figure out how to attach logs.
    file is 28mb how can I get it to you guys?
    Ethan Bonick
    @EthanB11
    Also, how do I create a locally-managed file? I only see a link to them (0 files) on the Configuration - Overview page. When I go into a system I don't see anything for locally-managed files like Spacewalk has or the Uyuni documentation shows.
    Pau Garcia Quiles
    @paususe

    file is 28mb how can I get it to you guys?

    just attach it to the github issue (drag and drop in the comment field)

    if github complains it's too big, upload it somewhere (dropbox, mega or whatever) and link from the issue
    Pau Garcia Quiles
    @paususe

    Also, how do I create a locally-managed file? I only see a link to them (0 files) on the Configuration - Overview page. When I go into a system I don't see anything for locally-managed files like Spacewalk has or the Uyuni documentation shows.

    https://www.uyuni-project.org/uyuni-docs/uyuni/reference/configuration/config-channels.html#config-config-channels-channel-details

    do you mean you don't see that?
    Ethan Bonick
    @EthanB11
    I don't see any way to add a locally-managed file when I am in a system. So open a system and go to Configuration - Overview:
    no way to add a locally-managed file there.
    Headings from Uyuni Configuration (Overview - View Files - Deploy Files - Compare Files - Manage Configuration Channels)
    Headings from Spacewalk 2.9 Configuration (Overview - View/Modify Files - Add Files - Deploy Files - Compare Files - Manage Configuration Channels)
    On a machine under Configuration - Overview there is no way to add a locally-managed file, like there is on Spacewalk.
    Pau Garcia Quiles
    @paususe
    mmm let me check
    Ethan Bonick
    @EthanB11
    System Sandbox configuration is also missing.
    Ethan Bonick
    @EthanB11
    Also digging into versioning. zypper info uyuni_server returns version 2020.01-7.1.uyuni while the website shows 2020.03 bottom left. Why don't the version numbers match up?
    Ethan Bonick
    @EthanB11
    A little more digging and if I take the url from my spacewalk server to add a locally deployed file I can get to the webpage on Uyuni and create the locally-managed file and deploy it. Just looks like some links are missing for locally-managed files and sandbox files.
    Pau Garcia Quiles
    @paususe

    Also digging into versioning. zypper info uyuni_server returns version 2020.01-7.1.uyuni while the website shows 2020.03 bottom left. Why don't the version numbers match up?

    @juliogonzalez please take a look at that. It's true zypper info patterns-uyuni_server returns 2020.01-7.1.uyuni. We may need some adjustment to our release scripts
    @EthanB11 this is only the second release with this YYYY.MM versioning schema, it seems we have something hardcoded somewhere

    Ethan Bonick
    @EthanB11
    @paususe I entered an issue on GitHub for the locally-managed files issue. uyuni-project/uyuni#2060
    Julio González Gil
    @juliogonzalez
    there are no release scripts to adjust, that's adjusted manually and sadly I didn't do it this time
    however since master is again open, I can't really fix it now, so it needs to stay that way
    of course I could prepare a script to start handling that, but I will need some time, as the version needs to be changed right now for the doc (OBS), pattern (OBS), release notes (OBS) and spacewalk-web (Git)
    so maybe first step would be a script to check that those are aligned, before I release anything
    we'll see
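(The alignment check Julio describes can be sketched as a small shell function. The four real locations — doc, pattern and release notes in OBS, spacewalk-web in Git — are not spelled out in the chat, so the file arguments below are placeholders to adapt.)

```shell
# check_versions EXPECTED FILE...
# Compare the version string stored in each given file against the
# expected release version (e.g. 2020.03) and report any mismatch.
# The real OBS/Git locations are placeholders -- pass the actual files.
check_versions() {
  expected="$1"
  shift
  status=0
  for f in "$@"; do
    found=$(cat "$f")
    if [ "$found" = "$expected" ]; then
      echo "OK: $f ($found)"
    else
      echo "MISMATCH: $f has '$found', expected '$expected'"
      status=1
    fi
  done
  return $status
}
```

A release script could run e.g. `check_versions 2020.03 doc.version pattern.version ...` and abort on a non-zero exit status before anything is tagged.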
    Pau Garcia Quiles
    @paususe
    @juliogonzalez please create a GH card for that
    Pau Garcia Quiles
    @paususe

    @paususe I entered an issue on GitHub for the locally-managed files issue. uyuni-project/uyuni#2060

    hey, sorry for the late reply: I think the problem is that's a Salt minion. Local config files work on traditional stack clients (system type = Management). For Salt clients (system type = Salt), you need to use state channels. Centrally-managed configuration channels work on both traditional and Salt clients. You can create a state channel from the main menu > Configuration > Channels > Create State Channel.
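(For illustration, a state inside such a state channel is plain Salt SLS. A minimal sketch — the state ID, target path and contents here are made up, not from the chat — could look like:)

```
# Minimal Salt state for a Uyuni state channel: deploy a config file
# to every minion subscribed to the channel. The state ID, path and
# contents below are illustrative examples only.
deploy_motd:
  file.managed:
    - name: /etc/motd
    - contents: |
        Managed by Uyuni
```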

    Pau Garcia Quiles
    @paususe

    Headings from Spacewalk 2.9 Configuration (Overview - View/Modify Files - Add Files - Deploy Files - Compare Files - Manage Configuration Channels)

    if you try that on Uyuni with a traditional client, you'll see Overview - View/Modify Files - Add Files - Manage Configuration Channels. You'll notice for traditional clients, there is no Formulas tab on top. Also, no idea why we don't show "Compare Files" and "Deploy Files", @renner do you know?

    Ethan Bonick
    @EthanB11
    @paususe I updated the GitHub issue, but have other questions. It seems that the basic feature set of a locally-managed file has gone away. Salt states can be applied to multiple machines, whereas locally-managed files cannot be. Is this just something in the way Salt itself works, so Uyuni has to work that way too?
    Ethan Bonick
    @EthanB11
    I'm running into an issue with the centos8 packages.
    * going to install missing packages...
    susemanager:bootstrap 212 kB/s | 2.6 kB 00:00
    Error:
     Problem 1: package python3-salt-2019.2.3-17.2.uyuni.x86_64 requires python3-tornado >= 4.2.1, but none of the providers can be installed
      - package python3-salt-2019.2.3-17.2.uyuni.x86_64 conflicts with python3-tornado >= 5 provided by python3-tornado-6.0.2-1.el8.x86_64
      - package salt-2019.2.3-17.2.uyuni.x86_64 requires python3-salt = 2019.2.3-17.2.uyuni, but none of the providers can be installed
      - conflicting requests
     Problem 2: package python3-salt-2019.2.3-17.2.uyuni.x86_64 requires python3-tornado >= 4.2.1, but none of the providers can be installed
      - package python3-salt-2019.2.3-17.2.uyuni.x86_64 conflicts with python3-tornado >= 5 provided by python3-tornado-6.0.2-1.el8.x86_64
      - package salt-2019.2.3-17.2.uyuni.x86_64 requires python3-salt = 2019.2.3-17.2.uyuni, but none of the providers can be installed
      - package salt-minion-2019.2.3-17.2.uyuni.x86_64 requires salt = 2019.2.3-17.2.uyuni, but none of the providers can be installed
      - conflicting requests
    (try to add '--allowerasing' to command line to replace conflicting packages or '--skip-broken' to skip uninstallable packages or '--nobest' to use not only best candidate packages)
    package salt is not installed
    ERROR: Failed to install all missing packages.
    I tried manually installing those packages from the uyuni client repo
    python3-tornado-6.0.2-1.el8.x86_64.rpm 28 MB/s | 722 kB 00:00
    python3-salt-2019.2.3-17.2.uyuni.x86_64.rpm 156 MB/s | 9.6 MB 00:00
    Error:
     Problem: package python3-salt-2019.2.3-17.2.uyuni.x86_64 conflicts with python3-tornado >= 5 provided by python3-tornado-6.0.2-1.el8.x86_64
      - conflicting requests
    (try to add '--allowerasing' to command line to replace conflicting packages or '--skip-broken' to skip uninstallable packages or '--nobest' to use not only best candidate packages)
    Pau Garcia Quiles
    @paususe
    @brejoc ^^^
    @EthanB11 please disable EPEL and re-try
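(Disabling EPEL on the client can be done as sketched below, assuming the repo id is `epel` and `dnf-plugins-core` is installed; verify the id with the repolist first.)

```shell
# Disable the EPEL repository on the CentOS 8 client
# (requires dnf-plugins-core; 'epel' is the usual repo id).
sudo dnf config-manager --set-disabled epel

# Confirm it no longer shows up among the enabled repositories.
dnf repolist --enabled
```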
    Ethan Bonick
    @EthanB11
    EPEL was disabled. The manual install was just a yum install using the rpms from my bootstrap repo.
    Pau Garcia Quiles
    @paususe
    python3-tornado-6.0.2 comes from EPEL, doesn't it?
    Ethan Bonick
    @EthanB11
    Here it is with EPEL disabled and python3-tornado removed. I also removed a COPR Spacewalk client channel that was still hanging around.
    * going to install missing packages...
    CentOS-8 - AppStream 2.5 MB/s | 6.6 MB 00:02
    CentOS-8 - Base 9.5 MB/s | 5.0 MB 00:00
    CentOS-8 - Extras 15 kB/s | 4.8 kB 00:00
    susemanager:bootstrap 1.5 MB/s | 47 kB 00:00
    Error:
     Problem 1: package python3-salt-2019.2.3-17.2.uyuni.x86_64 requires python3-tornado >= 4.2.1, but none of the providers can be installed
      - package python3-salt-2019.2.3-17.2.uyuni.x86_64 conflicts with python3-tornado >= 5 provided by python3-tornado-6.0.2-1.el8.x86_64
      - package salt-2019.2.3-17.2.uyuni.x86_64 requires python3-salt = 2019.2.3-17.2.uyuni, but none of the providers can be installed
      - conflicting requests
     Problem 2: package python3-salt-2019.2.3-17.2.uyuni.x86_64 requires python3-tornado >= 4.2.1, but none of the providers can be installed
      - package python3-salt-2019.2.3-17.2.uyuni.x86_64 conflicts with python3-tornado >= 5 provided by python3-tornado-6.0.2-1.el8.x86_64
      - package salt-2019.2.3-17.2.uyuni.x86_64 requires python3-salt = 2019.2.3-17.2.uyuni, but none of the providers can be installed
      - package salt-minion-2019.2.3-17.2.uyuni.x86_64 requires salt = 2019.2.3-17.2.uyuni, but none of the providers can be installed
      - conflicting requests
    (try to add '--allowerasing' to command line to replace conflicting packages or '--skip-broken' to skip uninstallable packages or '--nobest' to use not only best candidate packages)
    package salt is not installed
    ERROR: Failed to install all missing packages.
    [root@foo ~]# dnf search python3-tornado
    Last metadata expiration check: 0:00:11 ago on Sat 28 Mar 2020 08:49:17 AM CDT.
    ============================================================================= Name Exactly Matched: python3-tornado ==============================================================================
    python3-tornado.x86_64 : Scalable, non-blocking web server and tools
    [root@foo ~]# dnf info python3-tornado
    Last metadata expiration check: 0:00:17 ago on Sat 28 Mar 2020 08:49:17 AM CDT.
    Available Packages
    Name : python3-tornado
    Version : 6.0.2
    Release : 1.el8
    Architecture : x86_64
    Size : 722 k
    Source : python-tornado-6.0.2-1.el8.src.rpm
    Repository : susemanager:bootstrap
    Summary : Scalable, non-blocking web server and tools
    URL : http://www.tornadoweb.org
    License : ASL 2.0
    Description : Tornado is an open source version of the scalable, non-blocking web
    : server and tools.
    :
    : The framework is distinct from most mainstream web server frameworks
    : (and certainly most Python frameworks) because it is non-blocking and
    : reasonably fast. Because it is non-blocking and uses epoll, it can
    : handle thousands of simultaneous standing connections, which means it is
    : ideal for real-time web services.
    Pau Garcia Quiles
    @paususe
    @EthanB11 still, this does not make sense to me:
    Source : python-tornado-6.0.2-1.el8.src.rpm
    Repository : susemanager:bootstrap
    Pau Garcia Quiles
    @paususe
    mmm oh yeah. Maybe this is what happened:
    1. You enabled the CentOS repos, the RHEL/CentOS/SLES 8 client tools and EPEL 8 on the Server
    2. When the reposync finished, the bootstrap repository was automatically generated
    3. The most recent version of python-tornado was put in the bootstrap repository. The problem is that since EPEL was enabled, it was the version from EPEL, not the version from Uyuni.
    4. You have not performed a full sync again since then (or the sync brought no changes), so the bootstrap repository was not regenerated, and you still have the version from EPEL in there
    if I guessed correctly, then it will be solved by disabling EPEL and either a) performing a full reposync or, b) manually regenerating the bootstrap repository
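(The manual regeneration is done on the Uyuni server with `mgr-create-bootstrap-repo`. The sketch below assumes the `-l`/`-c` flags documented for SUSE Manager; the channel label is illustrative, so check `mgr-create-bootstrap-repo --help` and the `-l` listing for the real one.)

```shell
# On the Uyuni server: list the available bootstrap repo targets...
mgr-create-bootstrap-repo -l

# ...then regenerate the one for the client distribution in question
# (the label below is an example, not necessarily yours).
mgr-create-bootstrap-repo -c CentOS-8-x86_64
```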
    Ethan Bonick
    @EthanB11
    I removed all the packages from the child EPEL repo and re-ran the mgr-create-bootstrap-repo for centos8. The web bootstrapping did not work. I had to use a custom bootstrap.sh.
    pkg_|-salt-minion-package_|-salt-minion_|-latest(retcode=2): No information found for 'salt-minion'.
    file_|-/etc/salt/minion.d/susemanager.conf_|-/etc/salt/minion.d/susemanager.conf_|-managed(retcode=2): One or more requisite failed: bootstrap.salt-minion-package
    file_|-/etc/salt/minion_id_|-/etc/salt/minion_id_|-managed(retcode=2): One or more requisite failed: bootstrap.salt-minion-package
    file_|-/etc/salt/pki/minion/minion.pub_|-/etc/salt/pki/minion/minion.pub_|-managed(retcode=2): One or more requisite failed: bootstrap.salt-minion-package
    file_|-/etc/salt/pki/minion/minion.pem_|-/etc/salt/pki/minion/minion.pem_|-managed(retcode=2): One or more requisite failed: bootstrap.salt-minion-package
    service_|-salt-minion_|-salt-minion_|-running(retcode=2): One or more requisite failed: bootstrap./etc/salt/pki/minion/minion.pub, bootstrap./etc/salt/minion.d/susemanager.conf, bootstrap./etc/salt/pki/minion/minion.pem, bootstrap.salt-minion-package, bootstrap./etc/salt/minion_id