Jody Klymak
@jklymak
Because it's meant to look good from a "distance".
Individual pixels are not data
Thomas A Caswell
@tacaswell
I am taking the position on this that @anntzer takes on symlog
Jody Klymak
@jklymak
If you need individual pixels to be data, then over-sample
Thomas A Caswell
@tacaswell
with the higher-order kernels, resampling in RGBA space generated large regions of not-in-gamut color
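(A minimal sketch of the overshoot being described, using a hand-rolled Catmull-Rom cubic rather than Agg's actual kernels, which is an assumption for illustration: interpolating a step edge in a single color channel produces values outside [0, 1], i.e. colors no valid colormap contains.)

```python
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    """Catmull-Rom cubic interpolation between p1 and p2 at parameter t in [0, 1]."""
    return 0.5 * (2 * p1
                  + (p2 - p0) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (3 * p1 - 3 * p2 + p3 - p0) * t ** 3)

# A step edge in one RGBA channel: 0, 0, 0, 1 (e.g. a sharp dark-to-bright boundary).
channel = [0.0, 0.0, 0.0, 1.0]
t = np.linspace(0, 1, 11)
resampled = catmull_rom(*channel, t)

# The cubic undershoots below 0 just before the step: a channel value
# that no in-gamut color has, hence "not-in-gamut" regions.
print(resampled.min())
```

The same mechanism produces overshoot above 1 on the far side of a bright-to-dark edge; either way the resampled RGBA values are not colors from the original colormap.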
Benjamin Root
@WeatherGod
totally agree with @tacaswell here. It was considered such a horrendous bug when it was pointed out that we were creating out-of-gamut colors that we were willing to break baseline tests to fix it.
Jody Klymak
@jklymak
What do you mean by "out of gamut"? R>1 and/or R<0? Or just not in the colormap? If it was just out of the colormap, then I disagree - if you are subsampling an image, then you will get new colors at the color boundaries, and they will perceptually smooth the adjacent colors.
Thomas A Caswell
@tacaswell
I mean color not in the colormap
it makes the image completely non-interpretable
how do you show "red" in a viridis colorbar?
Thomas A Caswell
@tacaswell
left a long comment with pictures on the issue
in very short: the core of the problem is using continuous resampling on categorical data, and despite the current order being hard, doing it in the other order is unequivocally wrong
Antony Lee
@anntzer
I have a vague feeling that the correct approach may be 1) to resample in data space when smoothing (i.e. large pixels, few data points, and using e.g. sinc) because in that case out-of-gamut colors are just ridiculous (this is the case in your reply) and we can see the operation as interpolating undersampled data points first, but 2) to resample in color space in the opposite case of multiple data points going into a single pixel, because I tend to imagine this (as suggested by the OP) as "viewing the image from afar" (although really averaging should be done in some visual space such as Lab instead of RGB, but heh...)
Jody Klymak
@jklymak
Right - I was just writing something like this. There are two things the user wants: to smooth their data so it's not so "blocky", and to have a good-looking image. They are different operations; one should be done in data space and one in color space.
Antony Lee
@anntzer
1) is about smoothing data to look non-blocky, which I think may or may not be something that we really need to provide (at least in theory we could always say, we're not going to interpolate the data for you), but 2) is really about displaying something sensible when there are more points than pixels (and in that case straight decimation, which was the previous default, can easily give very misleading results (thanks for implementing interpolation="antialiased" and fixing that :)))
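(A sketch of the decimation problem being described, assuming matplotlib >= 3.2, where the "antialiased" interpolation mode exists: nearest-neighbor downsampling of a fine checkerboard keeps only whichever phase of the pattern the sample grid happens to hit, while "antialiased" resampling averages it down. The helper name `render` is mine, not from the chat.)

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

# A fine 0/1 checkerboard: many more data points than screen pixels.
x = np.arange(200)
data = (x[:, None] + x[None, :]) % 2

def render(interp):
    """Render the checkerboard at low resolution and return the RGBA buffer."""
    fig, ax = plt.subplots(figsize=(2, 2), dpi=50)
    ax.imshow(data, cmap="viridis", interpolation=interp)
    fig.canvas.draw()
    buf = np.asarray(fig.canvas.buffer_rgba()).copy()
    plt.close(fig)
    return buf

nearest = render("nearest")          # straight decimation: misleading moire
antialiased = render("antialiased")  # resampled: a far less misleading result
```

Comparing the two buffers shows they render genuinely different images from identical data; that difference is exactly what the change of default was about.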
Ian Hunt-Isaak
@ianhi
Are %matplotlib widget and %matplotlib ipympl the same thing? I really thought they were, but https://github.com/matplotlib/matplotlib/issues/18741#issuecomment-709455563 has me confused
Thomas A Caswell
@tacaswell
2) still needs to be done in data space though. If we have data that is a checkerboard of 0 and 1 values, one screen pixel covers many data values so the average of the data should be 0.5 and you should get a uniform color in the middle of the color map, not the average of the top and bottom colors
@ianhi Yes, they should be
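(A minimal illustration of the checkerboard point, assuming the viridis colormap: averaging in data space lands on the colormap's midpoint color, while averaging the two endpoint RGBA colors gives a color that appears nowhere in viridis.)

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

cmap = plt.get_cmap("viridis")

# Data-space averaging: the mean of checkerboard values 0 and 1 is 0.5,
# which maps to the colormap's midpoint (a green in viridis).
data_space = np.array(cmap(0.5))

# Color-space averaging: the mean of the two endpoint RGBA colors
# (dark purple and bright yellow), a muddy color not in the colormap.
color_space = np.mean([cmap(0.0), cmap(1.0)], axis=0)

print(data_space[:3])   # RGB of the midpoint color
print(color_space[:3])  # RGB of the averaged endpoint colors
```

The two results are visibly different colors, which is why the order of the "average" and "map through the colormap" operations matters.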
Ryan May
@dopplershift
I'm with @tacaswell regarding the image operations, but I'll take any further comments I have over to the issue.
Bruno Beltran
@brunobeltran
any easy way to answer discourse questions that are being forwarded from [matplotlib-users]?
only just realized we have such a list, but it seems like I'll have to figure out how to get the mailer to send me the historical message so that I can respond to it via email?
Bruno Beltran
@brunobeltran
(I tried nabble, but the ability to "reply" there is for "authorized users only")
Thomas A Caswell
@tacaswell
to post to the mailing list you need to be on the mailing list
Bruno Beltran
@brunobeltran
hmm I am on the mailing list, just need somewhere I can look up the correct "in-reply-to" for a message that came in before I was added to the list
Antony Lee
@anntzer
@tacaswell you should get the average of top and bottom colors if your mental model is "I'm looking at the checkerboard from very far away"...
Thomas A Caswell
@tacaswell
@brunobeltran oh, hmm, interesting not sure
Ryan May
@dopplershift
@anntzer Nope, if I sample a checkerboard of temperatures, I get a pixel that measures the average of the top and bottom temperatures.
Bruno Beltran
@brunobeltran

well, if somebody who is on the mailing list wants to do some zero-effort user help, feel free to copy/paste a response to the most recent message with:

You should be able to manually increase the maximum number of "ticks" that you find acceptable by using matplotlib.ticker.LogLocator(numticks=N), like so:

import matplotlib.pyplot as plt
import numpy as np
import matplotlib.ticker

t = np.logspace(0, 4, 10)
fig = plt.figure()
ax = fig.add_subplot()
ax.loglog(t, t**3)

enough_ticks = matplotlib.ticker.LogLocator(numticks=15)
ax.get_yaxis().set_major_locator(enough_ticks)
i'm on the list now so I can deal with follow-ups
Thomas A Caswell
@tacaswell
I have the sense there should be a better way of telling either the locator or the formatter to stop trying to be helpful
there is a lot of magic in the log ticker and locator...
Bruno Beltran
@brunobeltran
yeah I get that sense too, but it's not well documented (was literally opening an issue about LogLocator.__init__'s docstring right now), and I don't have time to figure out how it works or whether what I suggest actually works, so
🤷
Ryan May
@dopplershift
Didn't we just merge a PR suggesting using yaxis directly instead of get_yaxis()?
Thomas A Caswell
@tacaswell
yes
Bruno Beltran
@brunobeltran
ooops, thanks for that, you're right
Ryan May
@dopplershift
No worries
Bruno Beltran
@brunobeltran
muscle memory
Thomas A Caswell
@tacaswell
done
did the mailing list tell you you had to join to post?
Ryan May
@dopplershift
Regarding the checkerboard, I think of downsampling as a zoomed-out view of the data (or the real world) rather than a zoomed-out view of the picture we've made of the data (the RGB values).
Bruno Beltran
@brunobeltran
@tacaswell I think I can post just fine, it's just having the message to reply to in the first place ;)
Jody Klymak
@jklymak
Let's continue the checkerboard discussion at the issue; I think we should actually do both and make them clearer to the user.
hannah
@story645
GSOC is reducing scope next year (half the money & half the time expectation) https://twitter.com/pavithraes/status/1316816191235579904?s=19
Jody Klymak
@jklymak
Hi all,
Thomas A Caswell
@tacaswell
I may be a few minutes late, calling in from a different computer and apparently have to update zoom :facepalm:
Jody Klymak
@jklymak
@anntzer suggests that it would be relatively easy to vendor the agg filters, presumably without the 0-1 stricture that makes it a problem to deal with high-bandwidth data. OTOH, are we dead against scipy.ndimage? I've been told scipy is too big a dependency, but is that really true in 2020? What fraction of folks do we think are installing matplotlib but not scipy?
Ryan May
@dopplershift
It's not just that scipy is heavyweight; it's that scipy requires a Fortran compiler. Currently, conda-forge still can't build Windows packages of scipy on Windows-based CI infrastructure.
I hate the idea of reimplementing image processing that already exists in scipy.ndimage. At the same time, scipy is still not a trivial dependency.
Jody Klymak
@jklymak
As a point of reference, skimage uses scipy.ndimage.