    Michael Merrill
    @mhmerrill
    do we have a topic for today's Arkouda Weekly Call?
    Michael Merrill
    @mhmerrill
    If there are NO topics for today's meeting then we'll cancel
    pierce314159
    @pierce314159
    Arkouda v2022.04.15 was just released! Thanks to everyone who contributed!
    https://github.com/Bears-R-Us/arkouda/releases/tag/v2022.04.15
    Michael Merrill
    @mhmerrill
    unless we have a topic I am going to cancel today's Arkouda weekly call
    sorry for the late notice
    pierce314159
    @pierce314159
    Hi everyone! Today's Arkouda weekly call is canceled because Mike has another meeting
    pierce314159
    @pierce314159
    Arkouda v2022.05.05 was just released! Thanks to everyone who contributed!
    https://github.com/Bears-R-Us/arkouda/releases/tag/v2022.05.05
    Chris Long
    @compiling-is-winning
    Hi all, two possibly stupid questions, as I haven't built Arkouda since January; both relate to Apache Arrow (I assume for the new Parquet support):
    1) On my Mac laptop, I've had no success building dependencies locally with “make install-deps”. However, I can get each of Arrow, HDF5, and ZeroMQ from Brew. I added lines to Makefile.paths with the paths to /usr/local/Cellar for each of these. It compiles just fine, and I can run a single-locale Arkouda server with basic functionality. However, if I run “make test-all”, I get a bunch of failed tests related to Parquet. So am I pointing Arkouda to the right place to build with Arrow via that line in Makefile.paths?
    2) On Sherlock, I build hdf5 and zeromq fine with “make install-deps”. However, I get a CMake error when trying to build Arrow. Initially, the error was that it wasn’t finding Boost. So I installed a later version of CMake, but am now getting errors to the effect that CMake is looking for modules in the wrong directory and cannot find CMAKE_ROOT. Has anyone else encountered this?
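For reference, the Makefile.paths entries being described would look something like the following. This is a hypothetical sketch: the add-path helper is how Arkouda's build is usually pointed at extra prefixes, but the exact Cellar paths and versions below are illustrative, not taken from the chat.

```make
# Hypothetical Makefile.paths for brew-installed dependencies.
# The Cellar version numbers are placeholders; use the ones brew installed.
$(eval $(call add-path,/usr/local/Cellar/apache-arrow/8.0.0))
$(eval $(call add-path,/usr/local/Cellar/hdf5/1.12.1))
$(eval $(call add-path,/usr/local/Cellar/zeromq/4.3.4))
```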
    Brad Chamberlain
    @bradcray
    For #1, I have not tried to build Parquet / Arrow on my Mac but have not had to manually add paths to brew packages for other things. I guess my main thought there is whether some additional brew step is required to add the brew paths to your environment’s search path variables or the like? E.g., would a normal C compile find the headers and libraries without additional help? Some brew install steps ask you to add some lines to your .bashrc to enable them, though it’s been a while since I’ve had to do one of those, so I’m not certain.
    For the Parquet question, I’m tagging @bmcdonald3 who’s the resident expert.
    I have not seen that specific cmake error you’re seeing, though in the plain-old Chapel context, I have seen cases where our build process will embed cmake paths into some sort of files, and I’ve had to do a fairly extensive clobber to clean things up and rebuild once I’ve switched cmake versions. I haven’t yet taken the time to figure out why that is, or whether our Makefiles could be updated to take care of it automatically. I wonder whether the same could be true with your Sherlock build as well, though the context is definitely different…?
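Brad's "would a normal C compile find it" check can be scripted. A rough probe along these lines, with zmq as the example dependency (the library name and messages are illustrative):

```shell
# Probe whether the toolchain can see a brew-installed dependency's headers
# and libraries without extra -I/-L flags (zmq used as the example).
src=$(mktemp /tmp/zmq_check_XXXXXX.c)
printf '#include <zmq.h>\nint main(void){return 0;}\n' > "$src"
if cc "$src" -lzmq -o "${src%.c}.out" 2>/dev/null; then
  result="zmq visible to the default search paths"
else
  result="zmq NOT visible: add its prefix to Makefile.paths"
fi
echo "$result"
```

Either outcome is informative: a failure here means the compiler needs explicit paths, which is exactly what Makefile.paths supplies.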
    pierce314159
    @pierce314159
    hey @compiling-is-winning, @Ethan-DeBandi99 says he'll sync with you about this tomorrow, but we think the failed tests might be solved by PR Bears-R-Us/arkouda#1392
    Chris Long
    @compiling-is-winning
    @bradcray thanks! My guess is that it would. If I just run "which h5ls" (or something analogous for the zeromq and arrow dependencies) I get a path to a system directory that, on further inspection, contains a symbolic link to the directory in Cellar where it was installed with Brew. For the CMake issues on Sherlock, I will try clobbering all of my existing installs later tonight
    @pierce314159 cool, thanks! If I run "make test-all", I do indeed get one error in dataframe_test.py and two in io_test.py, with the same routines as mentioned in the PR you linked above
    Chris Long
    @compiling-is-winning
    Looks like with "make test-chapel" I get one error with UnitTestParquetCpp
    Michael Merrill
    @mhmerrill
    @compiling-is-winning i just checked in the PR that fixes the problem i think
    pierce314159
    @pierce314159
    also @compiling-is-winning, do you have the ARKOUDA_SERVER_PARQUET_SUPPORT env variable set? You might need to run export ARKOUDA_SERVER_PARQUET_SUPPORT=true and rebuild the server
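The sequence being suggested, spelled out (the rebuild command is shown commented because the exact make target depends on the local checkout):

```shell
# Enable Parquet support, then rebuild so the Parquet paths are compiled in.
export ARKOUDA_SERVER_PARQUET_SUPPORT=true
# make clean && make    # rebuild step, environment-specific
echo "ARKOUDA_SERVER_PARQUET_SUPPORT=$ARKOUDA_SERVER_PARQUET_SUPPORT"
```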
    Chris Long
    @compiling-is-winning
    @mhmerrill @pierce314159 thanks, that cleared up all but the dataframe_test issue with the Python tests on my Mac. Still fighting cmake on Sherlock
    Michael Merrill
    @mhmerrill
    Anyone have a topic for today’s call?
    Michael Merrill
    @mhmerrill
    Today's Arkouda Call will be further discussion of Radix sort structure and mason packages, are people ok with these topics?
    Rajendra Prasad Patil
    @rp98njit

    Hi everyone,

    I am trying to build arkouda
    I am getting this error.

    make: Entering directory `/home/r/rp98/arkouda'
    make compile-arrow-cpp
    make[1]: Entering directory `/home/r/rp98/arkouda'
    g++ -O3 -std=c++11 -c /home/r/rp98/arkouda//src/ArrowFunctions.cpp -o /home/r/rp98/arkouda//src/ArrowFunctions.o -I/home/r/rp98/anaconda3/envs/arkouda/include -L/home/r/rp98/anaconda3/envs/arkouda/lib
    In file included from /home/r/rp98/arkouda//src/ArrowFunctions.cpp:1:0:
    /home/r/rp98/arkouda//src/ArrowFunctions.h:7:10: fatal error: arrow/api.h: No such file or directory
     #include <arrow/api.h>
              ^~~~~~~~~~~~~
    compilation terminated.
    make[1]: *** [compile-arrow-cpp] Error 1
    make[1]: Leaving directory `/home/r/rp98/arkouda'
    make: *** [/home/r/rp98/arkouda//src/ArrowFunctions.o] Error 2
    make: Leaving directory `/home/r/rp98/arkouda'
    (arkouda) login-1-57 arkouda-njit >:

    Thank you in advance

    Michael Merrill
    @mhmerrill
    @rp98njit we’ve got someone looking into it, it might have to do with a PR we merged yesterday which made Parquet/Arrow a non-optional part of the build
    Ethan-DeBandi99
    @Ethan-DeBandi99
    @mhmerrill @rp98njit it definitely is related to that. I am looking to see what specifically is misconfigured to cause that.
    @rp98njit Can you confirm a few things for me? Do you have the LD_LIBRARY_PATH environment variable set?
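A quick way to run the check Ethan asks for; the conda prefix below is an assumption about the local setup (it matches the `-I`/`-L` paths in the error output above):

```shell
# Report whether LD_LIBRARY_PATH is set, then prepend the conda env's lib
# dir (prefix is an assumption; adjust to your environment).
echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-<unset>}"
export LD_LIBRARY_PATH="$HOME/anaconda3/envs/arkouda/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
```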
    Ethan-DeBandi99
    @Ethan-DeBandi99
    @rp98njit Also, have you run make install-arrow?
    Ethan-DeBandi99
    @Ethan-DeBandi99
    Ok - I believe the issue is that Arrow is not installed; running make install-arrow should resolve this. I have added an issue to update our install documentation to reflect this
    Ethan-DeBandi99
    @Ethan-DeBandi99

    @rp98njit I created a bare-bones conda environment to run the install in and make sure everything is functional. Here is the list of packages that need to be installed using conda before make install-arrow will run.

    • boost-cpp
    • snappy
    • thrift-cpp
    • re2
    • utf8proc

    Please let me know if it gives you any additional problems after installing the packages listed above.
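A one-shot version of the list above; the conda-forge channel is an assumption (it is where these C++ packages are typically published), followed by the Arrow build target mentioned earlier:

```shell
# Install the prerequisites listed above, then build Arrow.
# Channel choice (-c conda-forge) is an assumption about the setup.
conda install -c conda-forge boost-cpp snappy thrift-cpp re2 utf8proc
make install-arrow
```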

    Ethan-DeBandi99
    @Ethan-DeBandi99
    Additionally, here is the command if using pip:
    pip install boost snappy thrift re2 utf8proc
    pierce314159
    @pierce314159
    Arkouda v2022.06.06 was just released! Thanks to everyone who contributed!
    https://github.com/Bears-R-Us/arkouda/releases/tag/v2022.06.06
    Michael Merrill
    @mhmerrill
    Today's Arkouda Weekly Call is cancelled.
    Engin Kayraklioglu
    @e-kayrakli
    @/all In case you missed it, CHIUW is today and we just started. Just register here and hop in! https://hpe.zoom.us/meeting/register/tJEoc-2orzotH9QYrnGAJOIGveblfnGKtNw0 (Ooh, Brad is talking about Arkouda right now)
    Rajendra Prasad Patil
    @rp98njit

    Hi all,
    Thanks in advance.

    How can I verify that I installed Chapel with the "LLVM=bundled" configuration?

    Michael Merrill
    @mhmerrill
    @rp98njit you should put that question in the chapel gitter channel https://gitter.im/chapel-lang/chapel
    Michael Merrill
    @mhmerrill
    @/all Anyone have a topic for today's Arkouda weekly call? https://github.com/Bears-R-Us/ArkoudaWeeklyCall
    Michael Merrill
    @mhmerrill
    @/all if no takers to today's call we will cancel, any takers...
    @/all we will spend today's meeting talking about and recapping CHIUW.
    pierce314159
    @pierce314159
    just circling back on @rp98njit's issue, it was resolved by running conda install -c conda-forge pyarrow
    Zhihui Du
    @zhihuidu

    addEntry problem.
    In my tests on a large graph (amazon0601.txt), the st.addEntry function blocks somewhere after printing the following information:
    "2022-06-18:11:58:59 [ServerConfig] overMemLimit Line 202 INFO [Chapel] memory high watermark = 267023064 memory limit = 486862464614"
    No error information is given. It seems to enter an infinite loop.

    For small graphs, everything is good.

    Any suggestions on solving this problem? Thanks!

    Michael Merrill
    @mhmerrill
    @zhihuidu this message is printed when you allocate more memory than at any point before in the run. From the message it appears that you still have plenty of memory left. I don’t think the message is tied to the behavior you are seeing. Are you using standard Arkouda I/O to read in your data?
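A toy illustration of the watermark behavior Mike describes: the INFO line fires only when an allocation pushes past the previous peak, not when the limit is reached (the allocation numbers here are made up; the limit is copied from Zhihui's log line):

```shell
# Toy model of the overMemLimit high-watermark message: log only when the
# current allocation exceeds the previous peak. The limit is separate and
# is not what triggers the message.
watermark=0
limit=486862464614
for alloc in 100 300 200 500; do
  if [ "$alloc" -gt "$watermark" ]; then
    watermark=$alloc
    echo "memory high watermark = $watermark memory limit = $limit"
  fi
done
# prints three lines (peaks 100, 300, 500); 200 sets no new peak
```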
    Zhihui Du
    @zhihuidu
    Thanks, Mike!
    In the previous tests, I used writeln and it could not show output immediately.
    After using the standard Arkouda I/O, I have located the position of the problem.
    I am looking into the problem to see why a large graph will have an infinite loop.
    Michael Merrill
    @mhmerrill
    @/all anyone have a topic for today's Arkouda Weekly Call?
    Michael Merrill
    @mhmerrill
    @/all if there are no topics to discuss then we will cancel today's call.
    Michael Merrill
    @mhmerrill
    @/all would people be interested in me talking about a new/experimental way of interpreting arkouda array computation?
    pierce314159
    @pierce314159
    Hi everyone! The Arkouda weekly call for today is canceled
    Chris Long
    @compiling-is-winning
    I ask because I'm currently working on a server-side algorithm that will require concatenating two integers (one a set element index, the other a hash index) into a single uint64 that will be used to seed a random number generator. So it would be easiest if both indices above were uint32's, though not necessarily a deal-breaker if not.
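The packing Chris describes can be sketched directly; this assumes both indices fit in 32 bits, and the variable names are made up for illustration:

```shell
# Combine two 32-bit indices into one 64-bit seed: set-element index in the
# high 32 bits, hash index in the low 32 bits.
set_idx=7
hash_idx=42
seed=$(( (set_idx << 32) | (hash_idx & 0xFFFFFFFF) ))
echo "seed=$seed"   # 7 * 2^32 + 42 = 30064771114
```

Masking the low half guards against a hash index that overflows 32 bits silently clobbering the high bits.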