    Andi Barbour
    (josh) another pretty good error:
    In [5]: RE(count([dif_beam], num=2))
    Transient Scan ID: 17 Time: 2019-10-23 18:00:49
    Persistent Unique Scan ID: '00106dc8-d666-4cad-a07b-5c239fd2e564'
    New stream: 'baseline'
    New stream: 'primary'
    /tmp/test_new_stuff/bin/python: relocation error: /tmp/test_new_stuff/lib/python3.7/site-packages/PyQt5/Qt/plugins/platforms/../../lib/libQt5DBus.so.5: symbol dbus_message_get_allow_interactive_authorization, version LIBDBUS_1_3 not defined in file libdbus-1.so.3 with link time reference
    (/tmp/test_new_stuff) [xf23id1@xf23id-ws2 ~]$
    Andi Barbour
    used hw.det1 and hw.motor1 hinted and reproduced
    Andi Barbour
    reproduced the behavior above
    note we did everything starting ipython with ipython3 in order to get python version 3.....
    Thomas A Caswell
    that suggests something did not activate correctly...
    Takahiro Matsumoto

    Hello, I tried to install bluesky for trial use. But it seems that databroker cannot be installed with python3.8. Is python3.8 not supported for bluesky yet?

    (bluesky-tutorial) username@host[1007]$conda install -c lightsource2-tag bluesky ophyd databroker ipython matplotlib
    Fetching package metadata ...............
    Solving package specifications: .

    UnsatisfiableError: The following specifications were found to be in conflict:

    • databroker
    • python 3.8*
      Use "conda info <package>" to see the dependencies for each package.
    Why does LiveGrid not update live?
    Thomas A Caswell
    The packaging has not caught up with python3.8 yet
    This is to say, the code is compatible, but the conda packages are not built yet
    Thank you for the information on python3.8!
    I read bluesky tutorial on https://nsls-ii.github.io/bluesky/tutorial.html
    and would now like to try using bluesky
    with our control system at SPring-8 (unfortunately not EPICS).
    In this case, I may need to customize ophyd (https://nsls-ii.github.io/ophyd/).
    I would appreciate it if you could point me to some simple examples of
    read/set with ophyd without EPICS.
    Dan Allan
    @matumot The implementations in the ophyd.sim module may be of interest. Those return values that are either static or randomly generated, but they illustrate the principle. You would want to write a variation that does whatever hardware communication you need.
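To make that concrete, here is a minimal pure-Python sketch of the readable pattern that ophyd.sim illustrates. The class and names below are hypothetical, not part of ophyd: read() returns a value with a timestamp, and describe() returns metadata about it. For real hardware you would swap the random value for a call into your own control system.

```python
import random
import time

class RandomReadable:
    """Hypothetical minimal readable device: a stand-in for the
    pattern ophyd.sim illustrates (this class is NOT part of ophyd)."""

    def __init__(self, name):
        self.name = name

    def read(self):
        # For real hardware, replace random.random() with a call
        # into your control system's "get" operation.
        return {self.name: {"value": random.random(),
                            "timestamp": time.time()}}

    def describe(self):
        # Metadata describing what read() will return.
        return {self.name: {"source": "SIM:" + self.name,
                            "dtype": "number",
                            "shape": []}}

det = RandomReadable("det")
```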
    @danielballan Thank you for your information. I will try to check it. I also found ophyd-tango at https://github.com/bluesky/ophyd-tango This may also be useful for trying bluesky with a control framework other than EPICS.
    Hello, and thanks for all the work on bluesky. I was looking to implement some of my own plans, and saw the Msg() class from bluesky.utils being used quite a lot in some of the available plans. Is there any documentation on what types of messages are valid and what they do (e.g. what exactly does Msg('create') do)?
    As I was playing around I also noticed there's a required order to messages (e.g. 'create' must be sent before 'save'). I'd like to learn more about these protocols if possible.
    Thomas A Caswell
    @tangkong I'm excited you are digging down to that level!
    the cmd field of Msg is used to do dispatch inside the RunEngine, and it is possible to register additional commands
    https://blueskyproject.io/bluesky/run_engine.html documents the logic of how the main run loop works
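As an illustration only (not the real RunEngine code), the dispatch idea can be sketched like this: a toy engine maps each Msg's command string to a handler in a registry, so new commands can be registered, and it enforces the 'create' before 'save' ordering mentioned above. Msg here is a simplified stand-in for bluesky.utils.Msg.

```python
from collections import namedtuple

# Toy stand-in for bluesky.utils.Msg: just a command string plus a payload.
Msg = namedtuple("Msg", ["command", "obj"])

class ToyRunEngine:
    """Hypothetical sketch of command dispatch: each Msg's command
    string is looked up in a registry, and 'save' before 'create'
    is rejected."""

    def __init__(self):
        self._handlers = {"create": self._create, "save": self._save}
        self._bundle_open = False
        self.log = []

    def register_command(self, name, func):
        # Mirrors the idea that additional commands can be registered.
        self._handlers[name] = func

    def _create(self, msg):
        self._bundle_open = True   # open an event bundle
        self.log.append("create")

    def _save(self, msg):
        if not self._bundle_open:
            raise RuntimeError("'save' received before 'create'")
        self._bundle_open = False  # close the bundle
        self.log.append("save")

    def __call__(self, plan):
        # A "plan" here is just an iterable of Msg objects.
        for msg in plan:
            self._handlers[msg.command](msg)

engine = ToyRunEngine()
engine([Msg("create", None), Msg("save", None)])
```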
    Ah, thanks @tacaswell ! I'm sure I'll be back with more questions later, thanks for the direction.
    Hello again, I was looking into handling image files, and one of the notebooks in the binder examples pointed to this missing page: https://nsls-ii.github.io/databroker/assets.html
    Is there another place to get this information? I'm trying to set up my own databases for simulation's sake and would like to read a bit more about the best way to do this
    Dan Allan
    Oops, apologies for the delayed reply @tangkong. I took that down because it was so outdated it had become anti-helpful. Take a look at Resource and Datum here https://blueskyproject.io/event-model/data-model.html#resource-document. Will try to post new narrative documentation soon and post a link back here.
    Peter Johnsen

    Hello! It seems that when I attempt to install bluesky and associated packages using conda, the installation fails because of package conflicts. Using pip works fine, but I'd prefer not to use pip together with conda if possible. Any suggestions on how to get this working for conda? Output posted below:

    $ conda install -c lightsource2-tag bluesky ophyd databroker
    Collecting package metadata (current_repodata.json): done
    Solving environment: failed with initial frozen solve. Retrying with flexible solve.
    Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
    Collecting package metadata (repodata.json): done
    Solving environment: failed with initial frozen solve. Retrying with flexible solve.
    Solving environment: /
    Found conflicts! Looking for incompatible packages.
    This can take several minutes. Press CTRL-C to abort.

    UnsatisfiableError: The following specifications were found to be incompatible with each other:

    Package networkx conflicts for:
    ophyd -> networkx
    Package msgpack-numpy conflicts for:
    databroker -> msgpack-numpy
    bluesky -> msgpack-numpy
    Package boltons conflicts for:
    databroker -> boltons
    Package ipython conflicts for:
    bluesky -> ipython
    Package six conflicts for:
    databroker -> six
    Package databroker conflicts for:
    bluesky -> databroker
    Package numpy conflicts for:
    databroker -> numpy
    bluesky -> numpy
    ophyd -> numpy
    Package suitcase-msgpack conflicts for:
    databroker -> suitcase-msgpack
    Package glueviz conflicts for:
    databroker -> glueviz
    Package pyyaml conflicts for:
    databroker -> pyyaml
    Package msgpack-python conflicts for:
    databroker -> msgpack-python
    bluesky -> msgpack-python
    Package super_state_machine conflicts for:
    bluesky -> super_state_machine
    Package lmfit conflicts for:
    bluesky -> lmfit
    Package ophyd conflicts for:
    bluesky -> ophyd
    Package event-model conflicts for:
    databroker -> event-model[version='>=1.11.1']
    bluesky -> event-model[version='>=1.10.0']
    Package pandas conflicts for:
    databroker -> pandas
    Package suitcase-mongo conflicts for:
    databroker -> suitcase-mongo
    Package doct conflicts for:
    bluesky -> doct
    databroker -> doct
    Package intake conflicts for:
    databroker -> intake[version='>=0.5.2']
    Package pyepics conflicts for:
    ophyd -> pyepics
    Package ujson conflicts for:
    databroker -> ujson
    Package attrs conflicts for:
    databroker -> attrs[version='>=16.3.0']
    Package jsonschema conflicts for:
    databroker -> jsonschema
    Package toolz conflicts for:
    databroker -> toolz
    bluesky -> toolz
    Package pims conflicts for:
    databroker -> pims
    Package cycler conflicts for:
    bluesky -> cycler
    Package xarray conflicts for:
    databroker -> xarray
    Package vs2015_runtime conflicts for:
    python=3.7 -> vs2015_runtime[version='>=14.16.27012,<15.0a0']
    Package tqdm conflicts for:
    bluesky -> tqdm
    Package matplotlib conflicts for:
    bluesky -> matplotlib
    Package sqlite conflicts for:
    python=3.7 -> sqlite[version='>=3.25.3,<4.0a0|>=3.26.0,<4.0a0|>=3.27.2,<4.0a0|>=3.28.0,<4.0a0|>=3.29.0,<4.0a0|>=3.30.1,<4.0a0']
    Package pip conflicts for:
    python=3.7 -> pip
    Package pytz conflicts for:
    databroker -> pytz
    Package tornado conflicts for:
    databroker -> tornado
    Package humanize conflicts for:
    databroker -> humanize
    Package tifffile conflicts for:
    databroker -> tifffile
    Package historydict conflicts for:
    bluesky -> historydict
    Package dask conflicts for:
    databroker -> dask
    Package jinja2 conflicts for:
    databroker -> jinja2
    Package zict conflicts for:
    bluesky -> zict
    Package pymongo conflicts for:
    databroker -> pymongo
    Package tzlocal conflicts for:
    databroker -> tzlocal
    Package cytoolz conflicts for:
    databroker -> cytoolz
    Package h5py conflicts for:
    databroker -> h5py
    Package pyzmq conflicts for:
    bluesky -> pyzmq
    Package zarr conflicts for:
    databroker -> zarr
    Package requests conflicts for:
    databroker -> requests
    Package mongoquery conflicts for:
    databroker -> mongoquery
    Package openssl conflicts for:
    python=3.7 -> openssl[version='>=1.1.1a,<1.1.2a|>=1.1.1b,<1.1.2a|>=1.1.1c,<1.1.2a|>=1.1.1d,<1.1.2a']
    Package vc conflicts for:
    python=3.7 -> vc[version='14.*|>=14.1,<15.0a0']

    Dan Allan
    Hi @petercj. My colleague @mrakitin has very recently done some work revamping our conda build process to reuse components from conda-forge. Those packages are at the channel nsls2forge. I believe the old lightsource2-tag channel is essentially deprecated at this point, though it is difficult to know how to communicate that to users. (I don't think we want to delete it.) Did some documentation lead you to lightsource2-tag? If so, please let me know where, and I will update it.
    Peter Johnsen
    Thanks for the info Dan! Yes, I got the info from the bluesky tutorial page here: https://blueskyproject.io/bluesky/tutorial.html
    Dan Allan
    Thank you. Fix proposed here, for the record: bluesky/bluesky#1275
    Maksim Rakitin
    Hello @petercj! Sorry, I was busy in the morning and wasn't able to chime in. It looks like you are trying to install the software on a Windows machine, right (vc/vs2015_runtime)? Indeed, the old channel called lightsource2-tag is deprecated; it was originally aimed at the linux-64 arch, and a few packages for OSX were added manually there. The old system was too hard to manage and scale, as it used a "multiple packages per repo" approach, and we didn't have a way to build packages for Windows. With nsls2forge we completely changed the packaging approach, and now reuse the conda-forge approach of "feedstocks", allowing us to build one package per repo; where possible we use "noarch" packages, which can be installed on any platform and Python version (with some exceptions). Also, if a package is platform-dependent, we build it for all 3 platforms when possible (Linux, OSX, Windows). So, to install the software on Windows, you would need to use nsls2forge as @danielballan suggested, or create a blank conda environment and install everything from PyPI using pip. Please let us know if you have any questions or problems with the installation.
    Emma Cating-Subramanian
    This seems like a really basic question, but I am lost; how do I determine the PV of a specific device? If I have two stages, say, which are both supported by EPICS, how do I get (assign?) their respective PVs so that I can interface with each one independently?
    Thomas A Caswell
    @emca9711 do you have ophyd objects, or are you trying to create ophyd objects?
    Thomas A Caswell
    if you need to build up ophyd devices for the stages I suggest looking at https://github.com/bluesky/tutorial/blob/master/Device.ipynb (a live version of which is at try.nsls2.bnl.gov )
    there is an assumption that the PV names of the stages are composed by repeated concatenation of names: the ophyd object defines the suffixes, and you provide the prefix when you instantiate the device. So if your motors are named like 'stage1:x', 'stage1:y', 'stage2:x', 'stage2:y'
    and assuming those are motor records, you could define
    from ophyd import Device, Component as Cpt, EpicsMotor
    class MyStage(Device):
        x = Cpt(EpicsMotor, ':x')
        y = Cpt(EpicsMotor, ':y')
    stage1 = MyStage('stage1', name='stage1')
    stage2 = MyStage('stage2', name='stage2')
    Thomas A Caswell
    having to type the name 3 times in the configuration is not great, but it is something we could not figure out how to get around (the first time is the name in the python namespace, which you can't reliably introspect; the second is the PV prefix, which matches in this case but in general does not; and the last is the name the device thinks it has).
    you can then do stage1.read() (or pass it into the dets list of a plan) to read the stage as a whole, or stage1.x to access just the x motor on that stage
    or are you asking about how to set up the IOCs to drive the stage?
    Emma Cating-Subramanian
    I am just trying to communicate with a stage which is ophyd-supported via EpicsMotor. The ophyd documentation says "These devices have ready-made classes in Python. To configure them, the user need only provide a PV prefix and a name." but I cannot find any information about what PV prefix to use.
    In the example you provided, you never had to provide a device address or anything - so how does bluesky (or ophyd or epics) know which physical device is which?
    in the case of EpicsMotor it is the base-record PV
    (yes our PV names have {} in them, don't ask)
    Andi Barbour
    Minor question for my better understanding that is probably very obvious to you all: if one absolutely needs to add a sleep time to an ophyd device, what is the best function to use? time.sleep(), since most people have from time import sleep in the namespace? Or could the name sleep have been rebound, e.g. sleep = bps.sleep or some other crazy thing assigned to sleep?
    Thomas A Caswell
    if you really need to be sure, you can do the import from time in the function where you need it
    so you are not at the mercy of what has been done outside
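For example (the helper name here is hypothetical, not an ophyd API), a function-local import keeps you safe even if someone rebinds the module-level name:

```python
def settle(device, seconds):
    """Hypothetical helper: pause before touching a device. The
    local import means a module-level rebinding of the name 'sleep'
    cannot change what this function calls."""
    from time import sleep  # immune to outer rebinding
    sleep(seconds)

# Even if the module-level name is shadowed...
sleep = "rebound to something else"
settle(None, 0.01)  # ...settle still blocks via time.sleep
```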
    Emma Cating-Subramanian
    Thanks Thomas. I think I mis-phrased my question. How do I enumerate all the devices so that I can obtain their individual PV names?
    Thomas A Caswell
    EPICS does not have a central database of devices; it instead uses a UDP broadcast to find the servers. The client basically screams into the wind "Who has PV XXXYYY?" and if a server responds "I do!" they set up a TCP connection and talk over that going forward
    there has to be some side-band process (typically via a human) to get the base PV names into python
    SLAC has a project to maintain a database of devices + their ophyd classes to make this easier (https://github.com/pcdshub/happi)
    Emma Cating-Subramanian
    Thank you!