

You might want to mention this on that GitHub PR, though, just to bring it to his attention. I'm not sure how often he checks Gitter.

@ahwillia My paper has officially been accepted. You can find it at http://conference.scipy.org/proceedings/scipy2018/hameer_abbasi.html

**[Unknown, Quansight]**

Pull request opened by eric-wieser

pydata/sparse#307

Over in pygae/clifford, we're finding we want to be able to write optimized loops using `arr.coords` and `arr.data`.

This aims to be the minimum amount needed to make that work.

Marking as draft since this has no tests, and `@lower_constant` isn't working for me (cc @stuartarchibald)

Issue opened by gongliyu

pydata/sparse#308

scipy.sparse arrays support np.shape, but sparse.COO does not.

```
In [1]: import sparse
In [2]: import numpy as np
In [3]: import scipy.sparse
In [4]: x = sparse.random((3, 4, 5))
In [5]: x.shape
Out[5]: (3, 4, 5)
In [6]: np.shape(x)
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-6-b17f5597101a> in <module>
----> 1 np.shape(x)
<__array_function__ internals> in shape(*args, **kwargs)
TypeError: no implementation found for 'numpy.shape' on types that implement __array_function__: [<class 'sparse._coo.core.COO'>]
In [7]: y = scipy.sparse.random(3, 4)
In [8]: np.shape(y)
Out[8]: (3, 4)
```
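The `np.shape(x)` failure above comes down to the `__array_function__` protocol: NumPy hands the call to the duck array, which must opt in per function. A minimal toy sketch of that dispatch (the `ToyCOO` class here is hypothetical, not the library's actual implementation):

```python
import numpy as np

class ToyCOO:
    """Hypothetical stand-in for sparse.COO, illustrating how a duck
    array opts in to NumPy dispatch via __array_function__."""

    def __init__(self, shape):
        self.shape = shape

    def __array_function__(self, func, types, args, kwargs):
        # Handle only the NumPy functions we explicitly support;
        # returning NotImplemented makes NumPy raise the TypeError
        # seen in the traceback above for everything else.
        if func is np.shape:
            return self.shape
        return NotImplemented

x = ToyCOO((3, 4, 5))
print(np.shape(x))  # (3, 4, 5)
```

Once the protocol handles `np.shape`, the call succeeds instead of raising the "no implementation found" TypeError.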

*1 new commit pushed to `master`: pydata/sparse@b4bf66d - Fix attribute access for `__array_function__`. (#309)*

**[Unknown, Quansight]**

Issue opened by hameerabbasi

pydata/sparse#310

I was considering using `cppyy` as the C++/Python wrapper, as it allows creating runtime bindings. Everything else that can wrap TACO (pydata/sparse#299) practically can't create runtime types, and for Numba a re-implementation of all necessary parts would be needed.

For background, the reason we need runtime types is a type explosion within TACO.
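To illustrate the type explosion: in TACO, a tensor format is a per-dimension choice (e.g. dense vs. compressed), so the number of distinct formats grows exponentially with the number of dimensions. A hypothetical back-of-the-envelope sketch (names are illustrative, not TACO's API):

```python
from itertools import product

def format_names(ndim):
    # Each dimension independently picks a level format:
    # "D" (dense) or "C" (compressed), so there are 2**ndim
    # distinct formats -- too many to pre-instantiate as
    # compile-time types, hence the need for runtime bindings.
    return ["".join(fmt) for fmt in product("DC", repeat=ndim)]

print(len(format_names(3)))  # 8 distinct 3-d formats
print(len(format_names(5)))  # 32 distinct 5-d formats
```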

*1 new commit pushed to `master`: pydata/sparse@40138e0 - ENH: Add numba extension support for COO objects (#307)*

Issue opened by eric-wieser

pydata/sparse#311

It's been a while since the last release; it might be nice to make another one.

*1 new commit pushed to `master`: pydata/sparse@05e981c - Add changelog for 0.9.0. (#312)*

*1 new commit pushed to `master`: pydata/sparse@6a02779 - Add changelog for 0.9.1. (#313)*

`sparse` version 0.9.1 is up!
**[Unknown, Quansight]**

Issue opened by ezdac

pydata/sparse#314

I initialize a `DOK` array with a specific `dtype`, don't set any value on it, and convert it to a `COO`:

```
import numpy
import sparse
dok = sparse.DOK(1024, dtype=numpy.uint8)
coo = sparse.COO(dok)
```

The `dtype` of the `DOK` doesn't get applied to the `COO`:

```
dok.dtype
>>> dtype('uint8')
coo.dtype
>>> dtype('float64')
```

The `dtype` is only preserved when I set a nonzero value on the `DOK` beforehand:

```
dok = sparse.DOK(1024, dtype=numpy.uint8)
dok[0] = 1
coo = sparse.COO(dok)
coo.dtype
>>> dtype('uint8')
coo.dtype == dok.dtype
>>> True
```

You might argue that I should use `sparse.zeros()` in the first place, and while I generally agree, the above behaviour is unexpected and should probably be changed.
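The behaviour reported here is characteristic of empty-array construction in NumPy: with no stored values, the data array is built from an empty sequence, and NumPy defaults to `float64` unless the `dtype` is threaded through explicitly. A minimal sketch of the failure mode (`build_data` is a hypothetical helper, not the library's internal code):

```python
import numpy as np

def build_data(values, dtype=None):
    # Hypothetical stand-in for the internal construction step:
    # an empty list with no explicit dtype defaults to float64,
    # which is how the DOK's dtype can get lost on conversion.
    return np.asarray(values, dtype=dtype)

print(build_data([]).dtype)                  # float64 (dtype lost)
print(build_data([], dtype=np.uint8).dtype)  # uint8 (dtype preserved)
```

Passing the source array's `dtype` through at every construction site avoids the silent fallback.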

*1 new commit pushed to `master`: pydata/sparse@77791e8 - Fix incorrect dtype on empty DOK->COO. (#315)*

Issue opened by crumleyc

pydata/sparse#316

I am trying to compute the outer product between a dask array (`a`) and a COO sparse array (`b`) by calling `da.outer(a, b)`. It looks like it is using the correct protocol; I am just wondering if there is a better way to go about this?
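As a NumPy-only sketch (no dask or sparse required): the outer product is equivalent to broadcasting a column vector against a row vector, which is the kind of expression duck-array backends can evaluate lazily or sparsely:

```python
import numpy as np

a = np.arange(3)
b = np.arange(4)

# Broadcasting a column (shape (3, 1)) against a row (shape (1, 4))
# yields the (3, 4) outer product, matching np.outer(a, b).
outer = a[:, None] * b[None, :]

print(outer.shape)  # (3, 4)
```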
*1 new commit pushed to `master`: pydata/sparse@9dc40e1 - Add flatten method and outer function. (#317)*