Michael Grosser (grosser) | Zendesk.com | San Francisco, CA | grosser.it
Mostly Ruby/Rails hacker that tries to OS everything.

burke/zeus 3317

Boot any rails app in under a second.

ambethia/recaptcha 1775

ReCaptcha helpers for ruby apps

fakefs/fakefs 998

A fake filesystem. Use it in your tests.

gregorym/bump 149

Bump is a gem that will simplify the way you build gems.

dblock/rspec-rerun 94

Re-run (retry) failed RSpec examples.

grosser/acts_as_feed 23

Rails/AR: Transform a Model into a Feed Representation (Feed Reader)

anamartinez/large_object_store 22

Store large objects in memcache or others by slicing them.

bitbckt/resque-lifecycle 15

Lifecycle management for Resque jobs

grosser/after_commit_exception_notification 15

Rails: Get notified when an after_commit block blows up

adammw/kinesiscat 9

Netcat for AWS Kinesis Data Streams

started matryer/bitbar

started 3 hours ago

started matryer/bitbar-plugins

started 3 hours ago

fork spajic/parallel_tests

Ruby: 2 CPUs = 2x Testing Speed for RSpec, Test::Unit and Cucumber

forked 5 hours ago

started grosser/preoomkiller

started 15 hours ago

started grosser/parallel_tests

started 19 hours ago

started grosser/parallel_split_test

started a day ago

pull request comment gregorym/bump

make travis faster

Sorry for the interruption again, @grosser, and thank you for your kind response.

grosser

comment created a day ago

pull request comment gregorym/bump

make travis faster

Hello,

Sorry, I know it was a long time ago, but I just noticed that Travis cache: was only enabled by this pull request, even though Travis's caching feature had been available since December 17, 2014 (about a year earlier). I am interested in understanding the reason for delaying the cache feature in this repository.

Thank you.

grosser

comment created a day ago

started logseq/logseq

started a day ago

pull request comment premailer/premailer

cleanup

I got your point. Thank you so much, @grosser, for your kind response.

grosser

comment created a day ago

pull request comment premailer/premailer

cleanup

Hi @grosser, thanks for your reply and sorry for the interruption.

I just noticed that cache: was only enabled for Travis builds in this pull request, even though Travis's caching feature had been available since December 17, 2014 (more than a year earlier). I am interested in understanding the reason for delaying the cache feature.

Thank you.

grosser

comment created a day ago

pull request comment grosser/parallel

test

Hello,

Sorry, I know it was a long time ago, but I am curious why caching wasn't enabled before this (i.e., at any point after December 17, 2014, when it became available to open source projects).

Thank you.

grosser

comment created a day ago

pull request comment premailer/premailer

cleanup

Hello,

Sorry, I know it was a long time ago, but I am curious why caching wasn't enabled before this (i.e., at any point after December 17, 2014, when it became available to open source projects).

Thank you.

grosser

comment created a day ago

started grosser/parallel

started a day ago

started grosser/parallel

started 2 days ago

issue comment grosser/parallel

Segmentation fault with mysql2 and in_processes

So I updated the mysql2 and Ruby versions:

gem 'mysql2', '~> 0.5.3'
ruby:2.6.0

and ran the code with 4 processes (Parallel.map(ranges, :in_processes => 4) do |range|). The workers ran for some time, but slowly they got killed one by one with the errors below. They did run for much longer than yesterday, when I was on the older gem versions.

E0303 13:41:01.578019480       9 ssl_transport_security.cc:510] Corruption detected.
E0303 13:41:01.578086834       9 ssl_transport_security.cc:486] error:100003fc:SSL routines:OPENSSL_internal:SSLV3_ALERT_BAD_RECORD_MAC
E0303 13:41:01.578096288       9 secure_endpoint.cc:208]     Decryption error: TSI_DATA_CORRUPTED
E0303 13:41:01.579091347      36 ssl_transport_security.cc:510] Corruption detected.
E0303 13:41:01.579368866      36 ssl_transport_security.cc:486] error:100003fc:SSL routines:OPENSSL_internal:SSLV3_ALERT_BAD_RECORD_MAC
E0303 13:41:01.579534771      36 secure_endpoint.cc:208]     Decryption error: TSI_DATA_CORRUPTED
E0303 13:41:01.579499119      48 ssl_transport_security.cc:510] Corruption detected.
E0303 13:41:01.580041527      48 ssl_transport_security.cc:486] error:100003fc:SSL routines:OPENSSL_internal:SSLV3_ALERT_BAD_RECORD_MAC
E0303 13:41:01.580167078      48 secure_endpoint.cc:208]     Decryption error: TSI_DATA_CORRUPTED
E0303 13:41:01.580940403      78 ssl_transport_security.cc:510] Corruption detected.
E0303 13:41:01.581021246      78 ssl_transport_security.cc:486] error:100003fc:SSL routines:OPENSSL_internal:SSLV3_ALERT_BAD_RECORD_MAC
E0303 13:41:01.581055435      78 secure_endpoint.cc:208]     Decryption error: TSI_DATA_CORRUPTED
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/ivar.rb:169: [BUG] Segmentation fault at 0x00007efe599b2270
ruby 2.6.0p0 (2018-12-25 revision 66547) [x86_64-linux]

-- Control frame information -----------------------------------------------
c:0010 p:0028 s:0055 e:000051 METHOD /usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/ivar.rb:169
c:0009 p:0010 s:0043 e:000042 METHOD /usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/scheduled_task.rb:285
c:0008 p:0006 s:0039 e:000038 BLOCK  /usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/timer_set.rb:165
c:0007 p:0009 s:0036 e:000035 METHOD /usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:363
c:0006 p:0049 s:0028 e:000027 BLOCK  /usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:352 [FINISH]
c:0005 p:---- s:0022 e:000021 CFUNC  :loop
c:0004 p:0006 s:0018 e:000017 BLOCK  /usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:335 [FINISH]
c:0003 p:---- s:0015 e:000014 CFUNC  :catch
c:0002 p:0020 s:0010 e:000009 BLOCK  /usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:334 [FINISH]
c:0001 p:---- s:0003 e:000002 (none) [FINISH]

-- Ruby level backtrace information ----------------------------------------
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:334:in `block in create_worker'
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:334:in `catch'
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:335:in `block (2 levels) in create_worker'
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:335:in `loop'
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:352:in `block (3 levels) in create_worker'
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/ruby_thread_pool_executor.rb:363:in `run_task'
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/executor/timer_set.rb:165:in `block (2 levels) in process_tasks'
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/scheduled_task.rb:285:in `process_task'
/usr/local/bundle/gems/concurrent-ruby-1.1.8/lib/concurrent-ruby/concurrent/ivar.rb:169:in `safe_execute'

-- Machine register context ------------------------------------------------
 RIP: 0x00007efe8f8043c5 RBP: 0x0000000000000009 RSP: 0x00007efe942d99a8
 RAX: 0x00005555db9a2d94 RBX: 0x0000000000000003 RCX: 0x0000000000000001
 RDX: 0x0000000000000002 RDI: 0x00007efe599b2270 RSI: 0x0000000000000000
  R8: 0x00000000283ceef8  R9: 0x000000000002cc54 R10: 0x00000000ffffffff
 R11: 0x0000000000000000 R12: 0x0000000000000005 R13: 0x00007efe599b2230
 R14: 0x0000000000000000 R15: 0x00005555db9a2d70 EFL: 0x0000000000010202

Any thoughts on this?

abhinav-94

comment created 2 days ago

issue opened ambethia/recaptcha

TypeError with oddly shaped "g-recaptcha-response" params

Hi there,

I investigated some server errors and found exceptions coming from Recaptcha: TypeError: no implicit conversion from nil to integer

gems/recaptcha-5.3.0/lib/recaptcha/adapters/controller_methods.rb:80:in `recaptcha_response_token': no implicit conversion from nil to integer (TypeError)
from gems/recaptcha-5.3.0/lib/recaptcha/adapters/controller_methods.rb:16:in `verify_recaptcha'

The g-recaptcha-response param looks suspicious and seems to be submitted by a spammer:

...
"g-recaptcha-response": [
  "\n                "
],
...

I think it would be great if Recaptcha handled this and sanitized the param when it is an (effectively empty) array, at least with reCAPTCHA v2.

I am using reCAPTCHA v2 with gem version 5.3.0, but I believe this is still an issue in 5.7.0.
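
In the meantime, a minimal controller-level workaround sketch (the controller name, action, and normalize_recaptcha_param helper are made up here; only verify_recaptcha comes from the gem) could coerce the unexpected array into a string before verification:

class CommentsController < ApplicationController
  # Hypothetical workaround: turn an unexpected Array value of
  # "g-recaptcha-response" into a plain String before verify_recaptcha
  # runs, so the gem receives the token type it expects.
  before_action :normalize_recaptcha_param, only: :create

  def create
    if verify_recaptcha
      # ... handle the legitimate submission
    else
      head :unprocessable_entity
    end
  end

  private

  def normalize_recaptcha_param
    value = params["g-recaptcha-response"]
    params["g-recaptcha-response"] = Array(value).join.strip if value.is_a?(Array)
  end
end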

created 2 days ago

started daptin/daptin

started 2 days ago

started grosser/parallel

started 3 days ago

started nowaalex/af-virtual-scroll

started 4 days ago

started grosser/parallel_tests

started 4 days ago

issue opened grosser/parallel

Segmentation fault in Parallel (in_processes)

Below are my gem versions -

gem 'rails', '5.1.1'
gem 'mysql2'
ruby:2.5.0

I am trying to utilise the multiple cores available, using the Parallel gem to do a heavy, time-consuming task.

Parallel.map(ranges, :in_processes => 2) do |range|
  require 'parallel'
  @items_dao.reconnect
  # doStuff
  0
end

But I keep getting a Segmentation fault:

/usr/local/bundle/gems/sequel-4.18.0/lib/sequel/adapters/mysql2.rb:159: [BUG] Segmentation fault at 0x00007f1cb91518b8
ruby 2.5.0p0 (2017-12-25 revision 61468) [x86_64-linux]

-- Control frame information -----------------------------------------------
c:0069 p:---- s:0427 e:000426 CFUNC  :each
c:0068 p:0036 s:0422 e:000421 BLOCK  /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/adapters/mysql2.rb:159
c:0067 p:0085 s:0418 E:001d50 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/adapters/mysql2.rb:90
c:0066 p:0014 s:0406 e:000405 BLOCK  /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/adapters/shared/mysql_prepared_statements.rb:34
c:0065 p:0005 s:0402 e:000401 BLOCK  /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/database/connecting.rb:250
c:0064 p:0030 s:0398 e:000397 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/connection_pool/threaded.rb:85
c:0063 p:0018 s:0387 e:000386 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/database/connecting.rb:250
c:0062 p:0068 s:0382 e:000381 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/adapters/shared/mysql_prepared_statements.rb:34
c:0061 p:0058 s:0375 e:000374 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/dataset/actions.rb:908
c:0060 p:0040 s:0367 e:000366 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/adapters/mysql2.rb:191
c:0059 p:0007 s:0360 E:001dc8 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/adapters/mysql2.rb:153
c:0058 p:0029 s:0355 E:0022e8 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/dataset/actions.rb:139
c:0057 p:0100 s:0350 E:000fb8 METHOD /usr/local/bundle/gems/sequel-4.18.0/lib/sequel/dataset/actions.rb:740

The stack trace is really big; all the gems get listed, and at the very end it is Error: Parallel::DeadWorker.

If I use Parallel.map(ranges, :in_processes => 1), I also get the same error.

If I use Parallel.map(ranges, :in_threads => 2) do |range| instead of :in_processes, it works fine, but it only uses one core.

I have read in multiple places that it is usually the mysql gem version, but I have tried that as well; I just get a different gem in the Segmentation fault. Let me know if more information is required.
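
For reference, the pattern commonly suggested for forking with an open MySQL connection looks roughly like this (a sketch only, not a confirmed fix; it assumes a Sequel DB handle, since the trace goes through sequel-4.18.0, and the connection URL and workload below are placeholders):

# Disconnect in the parent before forking so no child process reuses
# the parent's MySQL socket; each worker then opens its own connection.
require 'sequel'
require 'parallel'

DB = Sequel.connect(ENV.fetch('DATABASE_URL'))  # placeholder connection setup
ranges = (1..100).each_slice(25).to_a           # placeholder workload

DB.disconnect  # parent drops its pooled connections before the fork

results = Parallel.map(ranges, :in_processes => 2) do |range|
  # Sequel lazily opens a fresh connection inside this forked worker.
  # doStuff with `range`
  range.size
end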

created 4 days ago

started grosser/parallel

started 4 days ago

issue comment grosser/parallel_tests

Parallel tests on Heroku CI

Although this approach works perfectly, it doesn't make use of the multiple CPUs in performance dynos and runs only a single process in each dyno. To be able to run multiple processes, I took a different approach.

As others have mentioned, Heroku builds a default database and sets DATABASE_URL, which overrides your database.yml configuration. This causes errors in rake parallel:setup. So instead, I unset DATABASE_URL and clone the default database in a bash script. Then I split the tests into groups according to the number of CPUs and the number of dynos.

Here is my configuration:

// app.json
{
  ...
  "environments": {
    "test": {
      "formation": {
        "test": {
          "quantity": 4,
          "size": "performance-l"
        }
      },
      "env": {  "POSTGRESQL_VERSION": "11"  },
      "addons": ["heroku-postgresql:in-dyno", "heroku-redis:in-dyno"],
      "scripts": {
        "test": "bash ./script/test"
      }
    }
  }
}
# script/test

#!/bin/bash

concurrency=$( nproc --all )
echo "-----> Found ${concurrency} CPUs"

spec_node_count=$(( $CI_NODE_TOTAL ))
process_count=$(( $spec_node_count * $concurrency ))

# Heroku buildpack creates a default database and sets $DATABASE_URL pointing to that database
# That means tests running on Heroku CI will ignore database.yml settings
# Also running rake parallel:setup somehow can not create the databases and causes errors
#
# The script below simply clones the default database provided by Heroku
# and then splits the tests into groups so that we can have as many processes as the number of CPUs

regex="^postgres:\/\/([a-zA-Z0-9]+):([a-zA-Z0-9]+)@([a-zA-Z0-9]+):([0-9]+)/([a-zA-Z0-9_]+)"

if [[ $DATABASE_URL =~ $regex ]]
then
  export DB_USERNAME="${BASH_REMATCH[1]}" # random value created by the buildpack
  export DB_PASSWORD="${BASH_REMATCH[2]}" # random value created by the buildpack
  export DB_HOST="${BASH_REMATCH[3]}" # most probably localhost
  export DB_PORT="${BASH_REMATCH[4]}" # most probably 5432

  buildpack_db_name="${BASH_REMATCH[5]}" # most probably postgres_buildpack_db
  export DB_NAME_PREFIX=$buildpack_db_name

  unset DATABASE_URL # we unset this, so that Rails uses database.yml settings

  # Create new databases using the default database as the template. One database per process
  # This is similar to rake parallel:setup
  # It should create databases such as postgres_buildpack_db2, postgres_buildpack_db3, ...
  echo "-----> Creating test databases for paralel testing"

  for (( idx=2; idx<=$concurrency; idx++ ))
  do
    createdb -O $DB_USERNAME -T $buildpack_db_name ${DB_NAME_PREFIX}${idx}
    echo "       Database ${DB_NAME_PREFIX}${idx} created"
  done

  # Find out which groups will be run on this machine
  # e.g. With 2 performance-l dynos,
  # dyno 1 should run groups 0,1,2,3,4,5,6,7
  # dyno 2 should run groups 8,9,10,11,12,13,14,15
  process_index_start=$(( $concurrency * $CI_NODE_INDEX ))
  process_index_end=$(( process_index_start + concurrency ))

  groups=$process_index_start
  for (( idx=$process_index_start + 1; idx<$process_index_end; idx++ ))
  do
    groups+=",${idx}"
  done

  echo "-----> Running groups ${groups} from ${process_count} processes"
  bundle exec parallel_rspec spec/ -n $process_count --only-group $groups
else
    echo "Unexpected DATABASE_URL format"
fi
# database.yml

test:
  <<: *default
  database: <%= ENV['DB_NAME_PREFIX'] || 'parasut_test' %><%= ENV['TEST_ENV_NUMBER'] %>
  pool: 5
  host: <%= ENV['DB_HOST'] || '127.0.0.1' %>
  port: <%= ENV['DB_PORT'] || 5432 %>
  username: <%= ENV['DB_USERNAME'] %>
  password: <%= ENV['DB_PASSWORD'] %>
joecorkerton

comment created 4 days ago

issue comment ambethia/recaptcha

recaptcha_v3 causing Content Security Policy: Ignoring “'unsafe-inline'”

No, I wish I did, otherwise I would create a PR for it.

yoshie902a

comment created 4 days ago

PR opened grosser/unicorn_wrangler

Instrument using distribution instead of histogram

@grosser @craig-day

This will break non-datadog statsd clients, however...
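
For illustration, the before/after call shape (a sketch assuming the dogstatsd-ruby client; the metric name and value are made up):

require 'datadog/statsd'

statsd = Datadog::Statsd.new('localhost', 8125)

# before: histogram, percentiles are aggregated per agent host
statsd.histogram('unicorn.request_time', 42)

# after: distribution, percentiles are aggregated server-side across hosts;
# plain (non-Datadog) statsd clients do not implement #distribution,
# which is why this change breaks them
statsd.distribution('unicorn.request_time', 42)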

+6 -6

0 comments

2 changed files

pr created 4 days ago

started tw-in-js/twind

started 4 days ago