Ondrej Kokes (kokes) · Prague, Czech Republic · Economist / data wrangler / table maker / chart producer

kokes/nbviewer.js 212

Client side rendering of Jupyter notebooks

kokes/od 75

Czech open data

HlidacStatu/Volicsky-Prukaz 17

A web application that prepares a voter card application for elections (currently the 2019 European Parliament elections). The voter can download the generated application, sign it, and save it as a PDF.

kokes/knod 8

A catalog of (not only) open data

kokes/cedr 4

Processing of subsidy data

kokes/cedr-n3 2

Data extraction from the central subsidy register (CEDR)

kokes/jManipulate 2

A JavaScript tool to edit CSS in a WYSIWYG fashion.

started bxcodec/faker

started time in 4 days

started joke2k/faker

started time in 4 days

push event kokes/blog

Ondrej Kokes

commit sha ab8f747c59b8b47f7a83ba8136715738eb9815e2

fixed url

view details

push time in 5 days

push event kokes/blog

Ondrej Kokes

commit sha 7dcdbfe746e0963bf33806b5feebeede0870a5ad

fixed url

view details

push time in 5 days

push event kokes/blog

Ondrej Kokes

commit sha 8b2b5a868a29bdd09c2e2a9bfedb07158f57272b

assorted tech links 5

view details

push time in 5 days

issue opened kokes/nbviewer.js

Incorporate changes from Microsoft's fork

Microsoft forked this for their Azure offering. Go through their changelog and see if there are things we could merge in.

They kept the file structure mostly intact, so we could just diff and see it right there.

created time in 6 days

issue opened kokes/od

[psp] view for interpellations

Something along the lines of:

with osoby as (
    select
        id_osoba,
        jmeno || ' ' || prijmeni as jmeno
    from psp.poslanci_osoby
)
select
    pr.*, os.jmeno as tazatel, os2.jmeno as dotazovany, los.*, org.*
from
    psp.interp_poradi pr
    inner join osoby os on os.id_osoba = pr.id_poslanec
    inner join osoby os2 on os2.id_osoba = pr.id_ministr
    inner join psp.interp_los_interpelaci los on los.id_los = pr.id_losovani
    inner join psp.poslanci_organy org on org.id_organ = los.id_org

created time in 7 days

pull request comment nteract/commuter

Add Dockerfile

Is there some reason for installing tini explicitly? Docker has had it built in for a few years now (PR 1, PR 2), so it's only a matter of passing --init.

groodt

comment created time in 7 days

issue opened great-expectations/great_expectations

Unreadable rendering of failed expectations

I had a multi-column expectation that failed, but the generated docs weren't readable.

To Reproduce
Steps to reproduce the behavior:

  1. Create a dataset of two or more columns.
  2. Use expect_column_pair_values_A_to_be_greater_than_B in a way that it will fail
  3. Generate docs

Expected behavior
In the "failed values" table, I'd expect either a table column per column of data, or at least some visual separation of the pair of values involved in the comparison (comma, dash, pipe, ...)

Environment (please complete the following information):

  • OS: macOS
  • GE Version: 0.10.12

Additional context

I included a screenshot of the issue. The zero in the table cell is the value of column A and the non-zero value is the other column's value. Both values are in separate spans, but they are not styled in any way to differentiate between them. (Screenshot: Screen Shot 2020-05-22 at 12 41 04 PM)

created time in 8 days

push event kokes/blog

Ondrej Kokes

commit sha 758030c688cb1a343d8d54fad94daf0ae215700e

postgres column store post

view details

push time in 10 days

issue comment golang/go

encoding/csv: skipping of empty rows leads to loss of data in single-column datasets

Which suggests that this is a bug in the reader code.

However, I'm not going to be surprised if fixing this breaks lots of users due to Hyrum's law and we're forced to just document the errant behaviour. Even though the package tries to follow RFC 4180, CSV files are one of those formats with many strange variants that do not follow any specific grammar.

If we're worried about breaking existing code, we could add a boolean flag, which defaults to the current behaviour and then we could discuss flipping it (and potentially removing it) for Go 2.


If I understand the implementation correctly, then one cannot just initialise a Reader struct, one has to go through NewReader, because the underlying io.Reader is unexported. In that case, we can enforce our default in the constructor. (Or we could flip the boolean flag to mean e.g. ParseBlankLines, which has the desired default value.)

Something along the lines of this (haven't tested it, just sketching):

diff --git a/src/encoding/csv/reader.go b/src/encoding/csv/reader.go
index c40aa506b0..cd2b0ccfc1 100644
--- a/src/encoding/csv/reader.go
+++ b/src/encoding/csv/reader.go
@@ -16,8 +16,8 @@
 //
 // Carriage returns before newline characters are silently removed.
 //
-// Blank lines are ignored. A line with only whitespace characters (excluding
-// the ending newline character) is not considered a blank line.
+// Blank lines are ignored by default. A line with only whitespace characters
+// (excluding the ending newline character) is not considered a blank line.
 //
 // Fields which start and stop with the quote character " are called
 // quoted-fields. The beginning and ending quote are not part of the
@@ -142,6 +142,9 @@ type Reader struct {
 	// By default, each call to Read returns newly allocated memory owned by the caller.
 	ReuseRecord bool
 
+	// If SkipBlankLines is true (default), rows with no data are skipped.
+	SkipBlankLines bool
+
 	TrailingComma bool // Deprecated: No longer used.
 
 	r *bufio.Reader
@@ -169,8 +172,9 @@ type Reader struct {
 // NewReader returns a new Reader that reads from r.
 func NewReader(r io.Reader) *Reader {
 	return &Reader{
-		Comma: ',',
-		r:     bufio.NewReader(r),
+		Comma:          ',',
+		SkipBlankLines: true,
+		r:              bufio.NewReader(r),
 	}
 }
 
@@ -268,7 +272,7 @@ func (r *Reader) readRecord(dst []string) ([]string, error) {
 			line = nil
 			continue // Skip comment lines
 		}
-		if errRead == nil && len(line) == lengthNL(line) {
+		if r.SkipBlankLines && errRead == nil && len(line) == lengthNL(line) {
 			line = nil
 			continue // Skip empty lines
 		}
kokes

comment created time in 11 days

pull request comment great-expectations/great_expectations

[BUGFIX] quantile boundaries can be zero now

(rebased on the current develop branch, hence the force push)

kokes

comment created time in 13 days

push event kokes/great_expectations

Brendan Alexander

commit sha f43644182142e4e2d5083341535433bc4d5a589a

Allow config substitutions to be passed to DataContext

view details

Brendan Alexander

commit sha 0cb5a73b07e42ebc3f0376bfe36ab71d5ea51110

Unset env variable

view details

Brendan Alexander

commit sha 917495ee59264bc562c04e017808909dfc6645dd

Fix some lines too long for PEP8

view details

Brendan Alexander

commit sha 74931fed2a7606ece8626d46f19e059b4169dccb

Ridiculously short lines for PEP8

view details

Brendan Alexander

commit sha fe3f7cd595c5ed7b6187489938b106011c583b52

Send dictionary of env vars to substitute_config_variable method

view details

Brendan Alexander

commit sha 2ad68872e68968a09753e5cfdc0e868fc4cfc3fd

Break up lines so black is happy

view details

Brendan Alexander

commit sha 5788b06d8fccd1175e2266a5598bfc38d347fe7d

Remove trailing white space

view details

Brendan Alexander

commit sha 97611a8a0cb0080c581aa8f1eb0ae183683e288e

Change config dictionary param to runtime_environment; use black to reformat files

view details

Brendan Alexander

commit sha afd653bb1838b0230a54ddd5024c6ce04b985399

Add change log message and update docs

view details

WilliamWsyHK

commit sha 2551698effd09c0c6314a671849bf5ecf0cab7e3

[FEATURE] Support expect_multicolumn_values_to_be_unique on Spark (#1294)

* Add multicolumn_map_expectation wrapper method
* Add expect_multicolumn_values_to_be_unique
* expect_multicolumn_values_to_be_unique for Spark is now implemented
* Add schema for Spark to avoid breaking test cases
* Update docs for implemented expect_multicolumn_values_to_be_unique
* Change the way how boolean_mapped_skip_values is set
* Apply `black`
* Add description to change log

Co-authored-by: James Campbell <james.p.campbell@gmail.com>

view details

rexboyce

commit sha 4efe6b5b88cdead80ea9f16a75d92e823f9816c7

[BUGFIX] fix extra expectations included by BasicSuiteBuilderProfiler #1422 (#1445)

* fix issue where extra expectations included by BasicSuiteBuilderProfiler
* ran linter
* Update changelog

Co-authored-by: James Campbell <james.p.campbell@gmail.com>

view details

Taylor Miller

commit sha 95dc5335eb2186dff98bb9a57eae8eda83da35ab

better checkpoint docstring (#1436) Co-authored-by: James Campbell <james.p.campbell@gmail.com>

view details

Ondrej Kokes

commit sha d1526781119b635cf60ee25eb27c82cb4f9b0cc1

quantile boundaries can be zero now

When setting quantile boundaries to zero, they'd be interpreted as "any" (implemented as +-infty). This commit fixes this by evaluating zero boundaries as zeroes and only None/null ones as +-infty.

view details

Ondrej Kokes

commit sha a1b66fc97af1c4b38b025a60fdd86c99cfe663c0

minor python style fixes

- native iteration over multiple collections (zip)
- leveraging dictionary getter with a default fallback

view details

push time in 13 days

PR opened great-expectations/great_expectations

[BUGFIX] quantile boundaries can be zero now

When setting quantile boundaries to zero, they'd be interpreted as "any" (implemented as +-infty). This commit fixes this by evaluating zero boundaries as zeroes and only None/null ones as +-infty.

I found this bug when I used expect_column_quantile_values_to_be_between and set my quantile range to zero ([0,0]). When I checked the rendered docs, this was displayed as [Any, Any] - I wondered whether this was only presentational, but it wasn't: these quantile ranges were not considered at all. I wrote a failing test and implemented a fix that turned it green.

There's also an optional second commit, where I fix a bit of code that's not terribly idiomatic. It's the same snippet of code where I fixed the issue, so I'm not proposing changes to some random piece of code.
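The root cause described above is a common Python truthiness pitfall: treating a boundary of 0 as "missing". A minimal sketch (the helper names are illustrative, not the actual great_expectations code):

```python
INF = float("inf")

def bound_or_inf_buggy(value, default):
    # Treats any falsy value as missing, so a boundary of 0 (or 0.0)
    # silently becomes +/-infinity -- the reported bug.
    return value or default

def bound_or_inf_fixed(value, default):
    # Only None means "no boundary"; zero is kept as a real boundary.
    return default if value is None else value

# A quantile range of [0, 0] should stay [0, 0]:
assert bound_or_inf_buggy(0, -INF) == -INF   # the bug: 0 collapses to -inf
assert bound_or_inf_fixed(0, -INF) == 0      # the fix: 0 is preserved
assert bound_or_inf_fixed(None, -INF) == -INF
```

The fix is simply to test `is None` instead of relying on truthiness.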

+26 -6

0 comment

3 changed files

pr created time in 13 days

push event kokes/great_expectations

Ondrej Kokes

commit sha 827e79d94429a74edb4050cb224f5df4d7709e45

minor python style fixes

- native iteration over multiple collections (zip)
- leveraging dictionary getter with a default fallback

view details

push time in 13 days

create branch kokes/great_expectations

branch : any_bound

created branch time in 13 days

issue opened great-expectations/great_expectations

helpful exceptions for empty datasets

I happened to have an empty dataset due to a pipeline failure and then ran some expectations - most of them ran just fine, but one threw an unexpected exception, which made things hard to track.

To reproduce, write your header into a CSV file and launch init using all the defaults (and pandas).

echo "foo,bar" > foo.csv
great_expectations init

Then launch great_expectations suite edit foo.warning and try to check for unique values. Instead of telling me "cannot run stats on empty datasets" or something along those lines, the procedure calculates the number of unique values as None, thus triggering a TypeError when comparing this None with float boundaries.

batch.expect_column_proportion_of_unique_values_to_be_between('foo', .5, .9)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-3-7aaacd9190af> in <module>
----> 1 batch.expect_column_proportion_of_unique_values_to_be_between('foo', .5, .9)

~/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/data_asset/util.py in f(*args, **kwargs)
     77         @wraps(self.mthd, assigned=("__name__", "__module__"))
     78         def f(*args, **kwargs):
---> 79             return self.mthd(obj, *args, **kwargs)
     80 
     81         f.__doc__ = doc

~/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/data_asset/data_asset.py in wrapper(self, *args, **kwargs)
    262 
    263                         else:
--> 264                             raise err
    265 
    266                 else:

~/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/data_asset/data_asset.py in wrapper(self, *args, **kwargs)
    247                 ):
    248                     try:
--> 249                         return_obj = func(self, **evaluation_args)
    250                         if isinstance(return_obj, dict):
    251                             return_obj = ExpectationValidationResult(**return_obj)

~/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/dataset/dataset.py in inner_wrapper(self, column, result_format, *args, **kwargs)
     93             null_count = element_count - nonnull_count
     94 
---> 95             evaluation_result = func(self, column, *args, **kwargs)
     96 
     97             if "success" not in evaluation_result:

~/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/dataset/dataset.py in expect_column_proportion_of_unique_values_to_be_between(self, column, min_value, max_value, strict_min, strict_max, result_format, include_config, catch_exceptions, meta)
   3006                 above_min = proportion_unique > min_value
   3007             else:
-> 3008                 above_min = proportion_unique >= min_value
   3009         else:
   3010             above_min = True

TypeError: '>=' not supported between instances of 'NoneType' and 'float'

It might be instructive to see if any other expectations fail on empty datasets.
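The traceback above boils down to comparing None with a float. A sketch of the failure and of a friendlier guard (the helper below is hypothetical, not the library's actual code):

```python
def proportion_above_min(proportion_unique, min_value):
    # On an empty dataset the computed proportion is None; comparing it
    # directly with a float raises the confusing TypeError shown above.
    # Raising a descriptive error instead makes the failure obvious:
    if proportion_unique is None:
        raise ValueError("cannot compute unique-value proportion of an empty dataset")
    return proportion_unique >= min_value

# The unguarded comparison fails exactly like in the traceback:
try:
    _ = None >= 0.5
    message = "no error"
except TypeError as exc:
    message = str(exc)
print(message)  # '>=' not supported between instances of 'NoneType' and 'float'
```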

created time in 13 days

issue opened golang/go

encoding/csv: skipping of empty rows leads to loss of data in single-column datasets

What version of Go are you using (go version)?

<pre>
$ go version
go version go1.14 darwin/amd64
</pre>

Does this issue reproduce with the latest release?

Yes.

What operating system and processor architecture are you using (go env)?

<details><summary><code>go env</code> Output</summary><br><pre>
$ go env
GO111MODULE=""
GOARCH="amd64"
GOBIN=""
GOCACHE="/Users/ondrej/Library/Caches/go-build"
GOENV="/Users/ondrej/Library/Application Support/go/env"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="darwin"
GOINSECURE=""
GONOPROXY=""
GONOSUMDB=""
GOOS="darwin"
GOPATH="/Users/ondrej/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/usr/local/Cellar/go/1.14/libexec"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/usr/local/Cellar/go/1.14/libexec/pkg/tool/darwin_amd64"
GCCGO="gccgo"
AR="ar"
CC="clang"
CXX="clang++"
CGO_ENABLED="1"
GOMOD=""
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fno-caret-diagnostics -Qunused-arguments -fmessage-length=0 -fdebug-prefix-map=/var/folders/hp/q7nph21s1q76nw1hv1hfxv2m0000gn/T/go-build988616937=/tmp/go-build -gno-record-gcc-switches -fno-common"
</pre></details>

What did you do?

I wrote a CSV with a single column and missing data.

What did you expect to see?

I expected to load the data back, intact.

What did you see instead?

I lost the missing values, encoding/csv skipped them as it skips blank lines. In this case, a blank line actually represents data.


I'm not sure I understand the rationale behind skipping blank lines. Neither in terms of common practice (why would I have blank lines in my CSVs?) nor in terms of standards (the closest we have is RFC 4180 and I couldn't find anything about blank lines - so I'm not sure if Go follows it).

Here's a reproduction of the problem. I wrote a dataset into a file and was unable to read it back.

<details> <pre>
package main

import (
	"encoding/csv"
	"errors"
	"log"
	"os"
	"reflect"
)

func writeData(filename string, data [][]string) error {
	f, err := os.Create(filename)
	if err != nil {
		return err
	}
	defer f.Close()
	cw := csv.NewWriter(f)
	defer cw.Flush()
	if err := cw.WriteAll(data); err != nil {
		return err
	}
	return nil
}

func readData(filename string) ([][]string, error) {
	f, err := os.Open(filename)
	if err != nil {
		return nil, err
	}
	defer f.Close()
	cr := csv.NewReader(f)
	rows, err := cr.ReadAll()
	if err != nil {
		return nil, err
	}
	return rows, nil
}

func run() error {
	fn := "data/roundtrip.csv"
	data := [][]string{{"john"}, {"jane"}, {""}, {"jack"}}

	if err := writeData(fn, data); err != nil {
		return err
	}

	returned, err := readData(fn)
	if err != nil {
		return err
	}
	if !reflect.DeepEqual(returned, data) {
		log.Println("expected", data, "got", returned)
		return errors.New("not equal")
	}

	return nil
}

func main() {
	if err := run(); err != nil {
		log.Fatal(err)
	}
}
</pre> </details>

created time in 16 days

issue comment great-expectations/great_expectations

NameError: name 'WithinGroup' is not defined

Duplicate of #1443.

shahinism

comment created time in 16 days

issue comment great-expectations/great_expectations

Lack of optional dependency crashes init

@eugmandel I understand that, but GE crashes before I connect to a database, before I do anything, really. And it's not an "I don't have SQLAlchemy" error; it's due to a lack of type information.

I even included a trivial way to reproduce the problem.

kokes

comment created time in 17 days

issue opened great-expectations/great_expectations

Lack of optional dependency crashes init

When I install Great Expectations without SQLAlchemy, great_expectations init will fail due to missing type information.

To Reproduce
Steps to reproduce the behavior:

docker run -it --rm python:3-slim bash
pip3 install great_expectations
great_expectations init

Expected behavior
I expected init to run. (After I manually installed SQLAlchemy, it did work.)

The reason for this is that the SQLAlchemy import is allowed to fail (https://github.com/great-expectations/great_expectations/blob/develop/great_expectations/dataset/util.py#L16), but its type information is later required at module level (https://github.com/great-expectations/great_expectations/blob/develop/great_expectations/dataset/util.py#L578).
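The crash pattern can be reproduced in isolation: a module-level annotation references a name whose import was allowed to fail. A minimal sketch (the names mirror the pattern, not the exact great_expectations code; binding the name to None and quoting the annotation are two possible fixes):

```python
from typing import List

try:
    # Optional dependency: the import is allowed to fail...
    from sqlalchemy.sql.functions import WithinGroup
except ImportError:
    # ...so the name must still be bound, otherwise any later use of
    # `WithinGroup` at module level raises NameError on import.
    WithinGroup = None

# Quoting the annotation defers its evaluation, so the signature no
# longer needs the class to exist at import time:
def approximate_percentiles(selects: "List[WithinGroup]", sql_engine_dialect=None):
    ...
```

Alternatively, `from __future__ import annotations` (PEP 563) defers evaluation of all annotations in the module.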

Environment (please complete the following information):

  • OS: macOS, Ubuntu
  • GE Version: latest, 0.10.11

Additional context

traceback:

<details> <pre>
$ great_expectations -v init
Traceback (most recent call last):
  File "/Users/okokes/.pyenv/versions/3.7.7/bin/great_expectations", line 5, in <module>
    from great_expectations.cli import main
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/__init__.py", line 7, in <module>
    from great_expectations.data_context import DataContext
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/data_context/__init__.py", line 3, in <module>
    from .data_context import BaseDataContext, DataContext, ExplorerDataContext
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/data_context/data_context.py", line 20, in <module>
    from great_expectations.core.usage_statistics.usage_statistics import (
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/core/usage_statistics/usage_statistics.py", line 20, in <module>
    from great_expectations.core.usage_statistics.anonymizers.data_docs_site_anonymizer import (
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/core/usage_statistics/anonymizers/data_docs_site_anonymizer.py", line 2, in <module>
    from great_expectations.core.usage_statistics.anonymizers.site_builder_anonymizer import (
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/core/usage_statistics/anonymizers/site_builder_anonymizer.py", line 2, in <module>
    from great_expectations.render.renderer.site_builder import (
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/render/renderer/__init__.py", line 7, in <module>
    from .other_section_renderer import ProfilingResultsOverviewSectionRenderer
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/render/renderer/other_section_renderer.py", line 4, in <module>
    from great_expectations.profile.basic_dataset_profiler import BasicDatasetProfiler
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/profile/__init__.py", line 1, in <module>
    from .basic_dataset_profiler import BasicDatasetProfiler
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/profile/basic_dataset_profiler.py", line 3, in <module>
    from great_expectations.profile.base import (
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/profile/base.py", line 8, in <module>
    from ..dataset import Dataset
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/dataset/__init__.py", line 3, in <module>
    from .dataset import Dataset
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/dataset/dataset.py", line 13, in <module>
    from great_expectations.dataset.util import (
  File "/Users/okokes/.pyenv/versions/3.7.7/lib/python3.7/site-packages/great_expectations/dataset/util.py", line 578, in <module>
    selects: List[WithinGroup], sql_engine_dialect: DefaultDialect
NameError: name 'WithinGroup' is not defined
</pre> </details>

created time in 18 days

issue opened golang/go

cmd/link: binary on darwin takes up more space on disk

What version of Go are you using (go version)?

<pre>
$ go version
go version devel +8ab37b1baf Mon Apr 20 18:32:58 2020 +0000 darwin/amd64
</pre>

Does this issue reproduce with the latest release?

No (as in this is a regression in the current master, the latest stable is fine)

What operating system and processor architecture are you using (go env)?

<details><summary><code>go env</code> Output</summary><br><pre>
$ go env
GO111MODULE=""
GOARCH="amd64"
GOBIN=""
GOCACHE="/Users/okokes/Library/Caches/go-build"
GOENV="/Users/okokes/Library/Application Support/go/env"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="darwin"
GOINSECURE=""
GONOPROXY=""
GONOSUMDB=""
GOOS="darwin"
GOPATH="/Users/okokes/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/usr/local/Cellar/go/1.14.2_1/libexec"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/usr/local/Cellar/go/1.14.2_1/libexec/pkg/tool/darwin_amd64"
GCCGO="gccgo"
AR="ar"
CC="clang"
CXX="clang++"
CGO_ENABLED="1"
GOMOD=""
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fno-caret-diagnostics -Qunused-arguments -fmessage-length=0 -fdebug-prefix-map=/var/folders/81/4jydp7kn51n6p68z88sqnkzc0000gn/T/go-build271357823=/tmp/go-build -gno-record-gcc-switches -fno-common"
</pre></details>

What did you do?

I noticed my binary went from 8MB (1.14.2) to 13MB (master), but only when running du -sh or inspecting the binary in Finder (under "how much it takes on disk").

When running stat on both binaries, they are very similar in size, so this issue only concerns disk usage, not file size.

I replicated the issue by creating a hello world app

<details><pre>
package main

import "fmt"

func main() {
	fmt.Println("ahoy")
}
</pre> </details>

And then I bisected it, starting at 181153369534c6987306c47630f9e4fbf07b467f (good) and ending at cb11c981df7b4dc40550ab71cc097c25d24d7a71 (bad).

<details><pre>
package main

import (
	"log"
	"os/exec"
)

func main() {
	build := exec.Command("/Users/okokes/git/go/src/make.bash")
	build.Dir = "/Users/okokes/git/go/src"
	err := build.Run()
	if err != nil {
		log.Fatal("toolchain build failed", err)
	}

	cmd := exec.Command("/Users/okokes/git/go/bin/go", "build", "src/hello.go")
	err = cmd.Run()
	if err != nil {
		log.Fatal("program build failed", err)
	}

	sz := exec.Command("du", "-sh", "hello")
	out, err := sz.Output()
	if err != nil {
		log.Fatal("du failed", err)
	}
	num := out[0]
	if num != '2' {
		log.Fatal(string(out))
	}
}
</pre></details>

Bisect identified 8ab37b1 as the first offending commit. I verified it manually - the commit before that leads to a 2.1MB binary on disk, this commit leads to 4.1MB on disk.

I could not replicate this on a Ubuntu 18.04 box, so I presume it's a Darwin thing.

What did you expect to see?

$ stat -f '%z %N' hello_*
2216280 hello_8ab37b1
2174008 hello_go1.14
$ du -sh hello_*
2.1M hello_8ab37b1
2.1M hello_go1.14

What did you see instead?

$ stat -f '%z %N' hello_*
2216280 hello_8ab37b1
2174008 hello_go1.14
$ du -sh hello_*
4.1M hello_8ab37b1
2.1M hello_go1.14

created time in 20 days

push event kokes/nbviewer.js

Ondrej Kokes

commit sha 4575ee0117d0a46bde8429bf1c439b08e9514562

defer script loading

view details

push time in 21 days

pull request comment nteract/commuter

Automatic testing via GitHub Actions

  1. Yup, the stacktrace is in my actions.
  2. When installing on Node 10/12, I'm told "npm WARN deprecated chokidar@2.1.8: Chokidar 2 will break on node v14+. Upgrade to chokidar 3 with 15x less dependencies.", so that will be an issue as well. I traced this a few days ago through babel/cli and watchpack and I hit a few beta packages, so I'm not sure it's ready to be updated yet.
  3. While this issue originally popped up on my local machine (macOS, Node v14.2.0), I can't quite replicate it now (it builds), though the stack trace should give us something to follow.
  4. I added a build:all to the yaml file and it passed (again, see my actions).
  5. Sadly, act (local CI) might be broken due to a parsing issue.
kokes

comment created time in 21 days

push event kokes/commuter

Ondrej Kokes

commit sha 07404c745e3f74a45da8b7b0cba63517dd984abd

adding a build step in CI

view details

push time in 21 days

issue opened kokes/nbviewer.js

bokeh support

bokeh has very specific MIME in ipynb, see https://github.com/bokeh/jupyter_bokeh/blob/master/src/renderer.ts#L39

should we just hotload the js dependency if we see this MIME and then eval it?

we'd be starting a whole new era of potentially dangerous javascript evaluation

created time in 22 days

push event kokes/commuter

push time in 22 days

PR opened nteract/commuter

Automatic testing via GitHub Actions

This PR introduces automated testing on each pull request and commit. This is to simplify development flows for people without suitable dev environments.

I have two questions:

  1. Tests are triggered using Node 10 and 12 (v14 failed), are there any other versions we want to test against? The README doesn't quite describe what environments are supported.
  2. Do we want to test other things? Like a production build?

Thanks!

+25 -1

0 comment

2 changed files

pr created time in 22 days

push event kokes/commuter

Ondrej Kokes

commit sha 35e07083f93d03e492b45113d0d9142a3eda5c16

testing docs

view details

push time in 22 days

push event kokes/commuter

Ondrej Kokes

commit sha 68ccf9a9c76a162b882a23c11830a476740b816d

basic github workflow

view details

push time in 22 days

push event kokes/commuter

Ondrej Kokes

commit sha 3ed9c328794ec0f72683958f6cf4c0241bfc64a3

basic github workflow

view details

push time in 22 days

create branch kokes/commuter

branch : github_actions

created branch time in 22 days

push event kokes/od

Ondrej Kokes

commit sha 01ceee3c40398ac2721d2ca41e7d6c506de46315

[dotinfo] new data format

view details

push time in a month

push event kokes/commuter

push time in a month

push event kokes/commuter

Ondrej Kokes

commit sha b314c5cc601ec006df971d22b2b2926d5a2c20d5

action comment

view details

push time in a month

push event kokes/commuter

Ondrej Kokes

commit sha fd19853e82fa1b073c4f7dbf126bdb48bbbe0e2c

basic github workflow

view details

push time in a month

fork kokes/commuter

🚎 Notebook sharing hub

fork in a month

push event kokes/blog

Ondrej Kokes

commit sha 502e4126d8cc8b304941e205a00be26146b402f8

airflow post

view details

push time in a month

push event kokes/blog

Ondrej Kokes

commit sha 9be9dc1011146478d71ea069eab42a8aa2e6bc27

airflow post

view details

push time in a month

push event kokes/blog

Ondrej Kokes

commit sha e1bd52aa311a10000abcd40a216e4c2dcbe91864

airflow post

view details

push time in a month

issue comment golang/go

Proposal: improve UX for major module versions

Regarding the second subproposal: What I'm wondering is whether the hard error that disallows installing of deprecated trees is future proof.

  1. There is a tutorial on using library@v1. This library is at some point deprecated in favour of v2, but that's a breaking change: the API is different and the tutorial may no longer apply. When users read the tutorial and try installing library@v1, they get "don't install this, it's deprecated, try v2 instead" - which might confuse them, as v2 is not really usable due to the breaking change, but they cannot install v1 at all - unless they dig out the last non-deprecated v1.x.y version (from where? pkg.go.dev? do they know it can be done?). I don't have a clear solution in mind, but perhaps some override or more verbose errors might be helpful here.

  2. Say I install v1.0.0 and it has a security issue, so v1.0.1 is released, fixing it (I haven't noticed, I'm still on v1.0.0). Even later on, v1.0.2 is released with a deprecated tag, because there's a v2 now. At this point, I have v1.0.0 locked in my go.(mod|sum), but there is no way for me to automatically update to a safe version with the same API (I can't use v2).

Both of these cases could use a "give me the last non-deprecated version within this major version" - because erring out does not help (in installing/in fixing a security issue) and suggestion to use a new major version doesn't either, because of the potential API changes.

Also, this sort of breaks the semver contract of "patch versions don't mess things up", since the newest patch version cannot be installed at all. That would break tools that automate dependency security updates (think dependabot, snyk): when a new patch version gets released (with just a deprecation tag), the tool that tries to suggest an update fails in a weird way. Maybe I'm misunderstanding what happens when go get -u pkg gets run.

I hope I understood the flows correctly. In any case, thanks for this proposal, I love UX improvements.

adg

comment created time in a month

issue comment cesko-digital/obce

Wikidata jako zdroj

Those duplicates usually arise because you have, say, two coats of arms or two sources for a website - since the data gets converted into a tabular format, a given record is multiplied into n rows. I've dealt with it many times, never satisfactorily :-)

kokes

comment created time in a month

issue comment cesko-digital/obce

Wikidata jako zdroj

Yep, Wikidata is neat, but it's not a primary source, so I'd only use it for things we can't easily get elsewhere (e.g. the coats of arms or websites) and I wouldn't pull things like RUIAN info from it.

If you need help with the specifics of the data models there, the Wikidata CS Facebook group is great for that.

kokes

comment created time in a month

issue comment cesko-digital/obce

Přidání starosty/primátora

OK, that will be more usable, just watch out for a few things:

  1. A name is not a unique identifier
  2. There is a difference between the nominating party and party membership (it probably depends on what you need it for)
  3. Coalitions are modelled there in a fairly complex way - I've burned myself on that many times; the main difficulty was distinguishing nominating parties, membership parties, and yet another party identifier (there are surprisingly many of them)

In any case, why do you need the political party? Is it relevant for anything specific?

MJ-sys

comment created time in a month

issue comment cesko-digital/obce

Přidání starosty/primátora

  1. Election results don't reflect the current state
  2. Election results don't reflect any state (the mayor doesn't have to be from the winning party - see Brno, or for that matter Prague)
MJ-sys

comment created time in a month

issue opened cesko-digital/obce

Wikidata jako zdroj

Hi, Wikidata (the structured DB that partially powers Wikipedia) contains a lot of well-sourced information. On top of that, there are links to other systems (e.g. IČO), so it can easily be cross-referenced with data from OVM or elsewhere.

I probably wouldn't pull coordinates from there (I'd take RUIAN directly instead of the API Talks; it's a more stable source), but the coats of arms, for instance, are there.

Example query: https://w.wiki/PKc
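The cross-referencing via IČO mentioned above could look roughly like this (hand-rolled sample rows, not the real Wikidata or OVM payloads):

```python
# Wikidata-style rows carrying the external IČO identifier...
wikidata = [
    {"ico": "00064581", "coat_of_arms": "Praha.svg"},
    {"ico": "44992785", "coat_of_arms": "Brno.svg"},
]

# ...and an OVM-style registry keyed by the same identifier.
ovm = {
    "00064581": {"name": "Hlavní město Praha"},
    "44992785": {"name": "Statutární město Brno"},
}

# A plain dictionary join on the shared IČO key links the two sources.
linked = [{**ovm[row["ico"]], **row} for row in wikidata if row["ico"] in ovm]
assert linked[0]["name"] == "Hlavní město Praha"
assert linked[1]["coat_of_arms"] == "Brno.svg"
```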

created time in a month

pull request commentsimple-salesforce/simple-salesforce

Lazy loading

Ended up rebasing on the current master, so it's all current - the only conflicts were in tests (with all the u' stuff), so I resolved that, and that resolved the points you raised as well.
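The lazy-loading idea behind the PR can be sketched as a generator over a paginated API (a simulated endpoint here, not the actual simple-salesforce internals):

```python
def fake_query(url=None):
    """Simulates a paginated query endpoint: two pages of records."""
    if url is None:
        return {"records": [{"ID": "1"}, {"ID": "2"}],
                "done": False, "nextRecordsUrl": "/page2", "totalSize": 3}
    return {"records": [{"ID": "3"}], "done": True, "totalSize": 3}

def query_all_lazy(query):
    """Yield records one page at a time instead of buffering all of them."""
    page = query()
    yield from page["records"]
    while not page["done"]:
        page = query(page["nextRecordsUrl"])
        yield from page["records"]

# The consumer sees a flat stream; pages are fetched only as needed.
ids = [r["ID"] for r in query_all_lazy(fake_query)]
assert ids == ["1", "2", "3"]
```

The memory win is that at no point does the whole result set need to sit in one list, which matters for very large queries.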

kokes

comment created time in a month

Pull request review commentsimple-salesforce/simple-salesforce

Lazy loading

 def test_query_all_include_deleted(self):
             OrderedDict([(u'records', [
                 OrderedDict([(u'ID', u'1')]),
                 OrderedDict([(u'ID', u'2')])
-            ]), (u'done', True)]))
+            ]), (u'done', True), (u'totalSize', 2)]))

rebased on the current master and resolved this

kokes

comment created time in a month

Pull request review commentsimple-salesforce/simple-salesforce

Lazy loading

 def test_query_all(self):
             OrderedDict([(u'records', [
                 OrderedDict([(u'ID', u'1')]),
                 OrderedDict([(u'ID', u'2')])
-            ]), (u'done', True)]))
+            ]), (u'done', True), (u'totalSize', 2)]))

rebased on the current master and resolved this

kokes

comment created time in a month

push eventkokes/simple-salesforce

JonWobken

commit sha 20602bac006df216f255667ed86b0f3bf390372b

Update __version__.py Update version to 74.4

view details

JonWobken

commit sha cf40347da665353c0db69a4a0f88fde72360f350

Update CHANGES

view details

JonWobken

commit sha 881356c0dac5e770870a50b87860aed572927bfe

Update __version__.py Update version

view details

JonWobken

commit sha 26cf7e981dc5e9b3c809264df9ff14739bc519e6

Update __version__.py Update maintainer/maintainer email

view details

JonWobken

commit sha 9f9f7b0d5feb09fa5d20680c6775ec605eab2124

Update .travis.yml

view details

JonWobken

commit sha ce320e0663c9bbf483f36e20429c567ea09e08f5

Update .travis.yml

view details

JonWobken

commit sha d9b53c67df7d84c9ed65f1b9fbb26654f211d35d

Update .travis.yml

view details

Nick Catalano

commit sha df928f03c62861da70915fff956f60b2061720d5

Add latest travis password

view details

Nick Catalano

commit sha 743cff17a39ec33d5eeacbd40a07b301632795f0

Travis Improvements (#357) Closes #317 Closes #254

  • Test against Python 3.7 and 3.8, disable conflicting deploys
  • No longer test unsupported versions that fail (except 2.7)
  • Update README to reflect change to travis-ci.com
  • Use latest version of pylint to let tests pass
  • Fix some pylint/requirements problems
  • Do not lint using pylint for pypy or python 2.7
  • Only run linting on one python version

view details

Nick Catalano

commit sha 1ebee0085b27ab5f9181a5e691fb8dc9fdc90163

1.0 Production Release (#364)

  • Remove deprecated `SalesforceAPI` class
  • Bumping default API version to 42.0
  • Remove requests[security] and replace with requests >= 2.22.0
  • Remove deprecated sandbox attribute
  • Remove some trailing spaces
  • Only support currently supported versions of python
  • Run isort on codebase to sort imports
  • Fix failing tests
  • Fix linting problems / style issues
  • List changes in CHANGES, change classifier, bump version
  • Refactor travis to address warnings Closes #365
  • Bit of cleanup for code readability
  • Remove unnecessary python2.x `u` from strings, normalize quotations

view details

Ondrej Kokes

commit sha 06a133f45a55ff737017111609cfb82f67797726

lazy loading in query_all

view details

Ondrej Kokes

commit sha 81efdb61f8833b5aeaeb245c80a29dd8c99c966b

docs

view details

Ondrej Kokes

commit sha b6c16f61cab5c97eb0117057547454a3e5482296

python 2 compatibility

view details

Ondrej Kokes

commit sha 6537a5b648a5094e288b63d06cd12aae7018bb47

linting

view details

Ondrej Kokes

commit sha f242ad17c0a4b1da9dc89c2fd8bbef84b553a5fc

typo

view details

Ondrej Kokes

commit sha aace6ef0393ede4f86e9651c4d5b1e7cf73c296a

PR number in CHANGES

view details

Ondrej Kokes

commit sha 4484e48dd60f23eae00b7e06692b1ff5e64ca9e1

tests that include totalSize

view details

Ondrej Kokes

commit sha b476f31e47813822f04f15cdf9a3568ef911eb1e

totalSize implementation

view details

Ondrej Kokes

commit sha a8c500981066cff4f885a189effec4fc4643a844

appeasing the linting gods

view details

push time in a month

push eventkokes/od

Ondrej Kokes

commit sha 55642a066fbdbef116df9b9795e08ed5dbe8102b

[eufondy] minor refactoring, works without initialization. The scripts now download the data themselves; for the 2007-2013 period it's always the latest (since there won't be anything newer), for 2014-2020 it downloads the currently available version (04/2020), and from then on it has to be updated manually.

view details

push time in a month

push eventkokes/od

Ondrej Kokes

commit sha 177beafccfbf9b7837692915185de11d99802448

[udhpsh] helper script

view details

push time in a month

push eventkokes/vaclavaky

Ondrej Kokes

commit sha 028ac35bc9130f3972eb904ea3970ad103a74381

additional info on the tractors

view details

push time in a month

push eventkokes/vaclavaky

Ondrej Kokes

commit sha 6258c0a915b7a13c194303f3bc27d907015e0b18

horse lengths

view details

push time in a month

push eventkokes/vaclavaky

Ondrej Kokes

commit sha ea1d7dd55f8d5b09a0f5a538a707ead2d3e0968e

human hairs

view details

push time in 2 months

issue openedkokes/vaclavaky

Seznamy z Wikipedie

There are two pages on the wiki with unusual units; go through them and pick out whatever is useful for us.

  • https://en.wikipedia.org/wiki/List_of_humorous_units_of_measurement
  • https://en.wikipedia.org/wiki/List_of_unusual_units_of_measurement

created time in 2 months

push eventkokes/vaclavaky

Ondrej Kokes

commit sha d58e7884089eeb554d7747700d503fe773b8c3f0

new forex rates

view details

Ondrej Kokes

commit sha 8fb24689bb1deeb5d16bcf4099b9505587873cf6

README and license

view details

push time in 2 months

push eventkokes/vaclavaky

hlad

commit sha bd18ec579e73f3ea343c75b791f744120984dd1d

fixed the value of the second

view details

push time in 2 months

PR merged kokes/vaclavaky

fixed the value of the second

fixed the value of the second from 1 to 1/60

+1 -1

1 comment

1 changed file

hlad

pr closed time in 2 months

pull request commentkokes/vaclavaky

fixed the value of the second

🤦‍♂ thank you

hlad

comment created time in 2 months

push eventkokes/vaclavaky

Ondrej Kokes

commit sha 8ef760c02ddfc1359de63a7dd81e2886a0aef8cb

PR template

view details

push time in 2 months

push eventkokes/vaclavaky

Ondrej Kokes

commit sha 42603400c76ffdc0644a0e17487cad42c7fe722c

better unit handling - explanatory notes for the dimensions

view details

push time in 2 months

push eventkokes/vaclavaky

Jan Piskvor Martinec

commit sha 2f31371ba3c7ca92bc19b1236e8edd55f4645de6

"keep a distance of one tapir lengthwise"

view details

push time in 2 months

PR merged kokes/vaclavaky

"keep a distance of one tapir lengthwise"

If the comparisons include the Finnish maakotka, the tapir certainly shouldn't be missing either. (Measurement source: Bionag Guatemala, https://www.facebook.com/bionag/photos/a.1517805361845838/2326913190935047/?type=3&theater )

+4 -0

1 comment

1 changed file

Piskvor

pr closed time in 2 months

pull request commentkokes/vaclavaky

"keep a distance of one tapir lengthwise"

With measurements like these I usually include the weight as well (see e.g. the rorquals), but the range for tapirs is so large here that it probably doesn't make sense (just like I skipped it yesterday for the eagles).

So it's fine, thanks.

Piskvor

comment created time in 2 months

push eventkokes/vaclavaky

Ondrej Kokes

commit sha 460226715f30e4de082348f4fc34f9d68b91bb71

toaletaky

view details

push time in 2 months

push eventkokes/vaclavaky

Ondrej Kokes

commit sha 18ebe35fec41d5e1de52c6d01af7361ff6da0d54

golden eagle

view details

push time in 2 months

push eventkokes/blog

Ondrej Kokes

commit sha a791bd75b7637912cc7aff4a223bd733b5fff2c1

mba post

view details

push time in 2 months

issue openedheroku/salesforce-bulk

Deprecate in favour of simple-salesforce

Hey! I remember using salesforce-bulk way back in the day - it was great, but it doesn't seem to be maintained anymore. We've been using simple-salesforce in production for a number of years; it's been very solid and has continued support (1.0 is planned - https://github.com/simple-salesforce/simple-salesforce/pull/364). It also goes beyond just bulk (which it does support - https://github.com/simple-salesforce/simple-salesforce/commit/e9433e64cc3eef0c7a177aee1d8958a1c324f39d): it has a lot of SFDC introspection, non-bulk APIs etc.

Do you think that, for the sake of consolidation in this space, you might consider archiving this in favour of simple-salesforce? Or, if not, somehow describe this library's future in the docs, so that it's clearer to its users?

cc @lambacck

created time in 3 months

issue commentsimple-salesforce/simple-salesforce

Increase the default Salesforce API Version

Yes to this. We've bumped into this issue, where some records were not returned on an older API version. Can't remember the specifics, but it was a weird edge case. Bumping the version in our connection string solved the problem, so I'm all for having a more sensible default.

nickcatal

comment created time in 3 months

push eventPyDataCZ/pydata.cz

Jan Pipek

commit sha a6c75e1c07bdfc463bff057ca27b4a45377b258f

Diar-slides, Andrej-video

view details

push time in 3 months

PR merged PyDataCZ/pydata.cz

Diar-slides, Andrej-video
+2 -2

0 comment

2 changed files

janpipek

pr closed time in 3 months

push eventkokes/pydantic

Ondrej Kokes

commit sha 3d7f1f81be4efa9c171187d0d24263326a5bb181

Update docs/examples/index_main.py explicit assignment for an optional field Co-Authored-By: Samuel Colvin <samcolvin@gmail.com>

view details

push time in 3 months

push eventkokes/pydantic

Ondrej Kokes

commit sha 65b2bab641b27bbe13947d4f22626a62722bd704

Update README.md explicit assignment for an optional field Co-Authored-By: Samuel Colvin <samcolvin@gmail.com>

view details

push time in 3 months

more