Ben Wilson (benwilson512), Baltimore, MD

akira/exq 1203

Job processing library for Elixir - compatible with Resque / Sidekiq

benwilson512/ar-transmogrifier 6

Convert your ActiveRecord schema.rb into Ecto Models.

benwilson512/awesome-elixir 1

A curated list of amazingly awesome Elixir and Erlang libraries, resources and shiny things.

benwilson512/absinthe_benchmark 0

super basic code to profile absinthe with

benwilson512/amnesia 0

Mnesia wrapper for Elixir.

benwilson512/apollo-tracing-elixir 0

Apollo Tracing middleware for Absinthe

PR closed absinthe-graphql/absinthe

Copy logger metadata for async and batch

Hi folks! Thanks for creating and maintaining an awesome library. I debated opening an issue for this first, but decided it's a simple enough change that I would just create a PR. I'm happy to rework it or scrap it if that's what's decided.

Currently, whenever the async or batch middleware is used, any Logger metadata is lost when the new async tasks are created. This means that the Phoenix request ID doesn't show up in log lines when using Phoenix, and that a user ID or other metadata needs to be added in every resolver to be present in logs.

This copies the logger metadata from the parent process whenever a new async task is created so that any logs further down the stack will have access to it.
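
For illustration, a minimal sketch of the idea (module and function names here are made up, not the PR's diff): capture the parent process's Logger metadata and re-apply it inside the spawned task.

  defmodule MyApp.TaskHelper do
    # Hypothetical helper: copies the caller's Logger metadata into the task
    # so downstream log lines keep the request ID and other metadata.
    def async_with_metadata(fun) do
      metadata = Logger.metadata()

      Task.async(fn ->
        Logger.metadata(metadata)
        fun.()
      end)
    end
  end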

Thanks for reviewing! I look forward to any feedback 😄

+70 -2

4 comments

5 changed files

dnsbty

pr closed time in 2 days

pull request comment absinthe-graphql/absinthe

Copy logger metadata for async and batch

I don't think this is something we should do in Absinthe. I am advocating for a general context propagation solution in Erlang.

I also realized that you can do this much more easily yourself: the Async middleware can take a task you've defined yourself, so you can take this example and tweak it just a bit:

https://github.com/absinthe-graphql/absinthe/blob/d83a1ff439f9d52575e34940df492528738d038a/test/absinthe/middleware/async_test.exs#L41-L50
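
For example, a sketch along the lines of that test (field name and return value are illustrative): build the task yourself, copy the Logger metadata into it, and hand the task to the Async middleware.

  field :async_thing, :string do
    resolve fn _, _, _ ->
      # Capture the parent's Logger metadata before spawning the task.
      metadata = Logger.metadata()

      task =
        Task.async(fn ->
          # Re-apply it inside the task so downstream logs keep the metadata.
          Logger.metadata(metadata)
          {:ok, "we async now"}
        end)

      {:middleware, Absinthe.Middleware.Async, task}
    end
  end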

dnsbty

comment created time in 2 days

issue comment absinthe-graphql/absinthe

Change to imported module does not recompile outer module

I think something's up with how I tested it. The solution above does seem to work, although somehow only Boo.B gets recompiled but not A. :confused:

> iex -S mix
Erlang/OTP 23 [erts-11.1.5] [source] [64-bit] [smp:8:8] [ds:8:8:10] [async-threads:1] [hipe]

Interactive Elixir (1.11.3) - press Ctrl+C to exit (type h() ENTER for help)
iex(1)> A.Compiled.__absinthe_type__(:new_type) == nil
false
iex(2)> recompile
Compiling 1 file (.ex)
:ok
iex(3)> A.Compiled.__absinthe_type__(:new_type) == nil
true
dylan-chong

comment created time in 3 days

pull request comment absinthe-graphql/absinthe_plug

Set all plug options via put_options

Released in 1.5.4

binaryseed

comment created time in 3 days

release absinthe-graphql/absinthe_plug

v1.5.4

released time in 3 days

release absinthe-graphql/absinthe_plug

v1.5.3

released time in 3 days

created tag absinthe-graphql/absinthe_plug

tag v1.5.3

Plug support for Absinthe, the GraphQL toolkit for Elixir

created time in 3 days

created tag absinthe-graphql/absinthe_plug

tag v1.5.4

Plug support for Absinthe, the GraphQL toolkit for Elixir

created time in 3 days

push event absinthe-graphql/absinthe_plug

Vince Foley

commit sha d1f1c61feb4856bce81fc91443878aa70812ef09

Bump version 1.5.4

view details

push time in 3 days

PR merged absinthe-graphql/absinthe_plug

Set all plug options via put_options

This PR wires up the ability to set complexity settings via put_options

@benwilson512 These make sense to set on a per-request basis, but I'm not sure if the other options do...
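
For context, a hedged sketch of what setting these per request might look like with this change (the plug module and limit function below are hypothetical):

  defmodule MyAppWeb.ComplexityLimitPlug do
    @behaviour Plug

    def init(opts), do: opts

    def call(conn, _opts) do
      # MyAppWeb.Complexity.limit_for/1 is a hypothetical per-request limit;
      # the option names mirror Absinthe.Plug's analyze_complexity and
      # max_complexity plug options.
      Absinthe.Plug.put_options(conn,
        analyze_complexity: true,
        max_complexity: MyAppWeb.Complexity.limit_for(conn)
      )
    end
  end

Such a plug would run in the pipeline before Absinthe.Plug itself.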

closes #241

+94 -1

2 comments

3 changed files

binaryseed

pr closed time in 3 days

push event absinthe-graphql/absinthe_plug

Vince Foley

commit sha 8d06e0bdb1f58983ddc622c7542985a5df8a94cc

Set complexity via put_options

view details

Vince Foley

commit sha f1f2801148292afa60b21dec6876efca95812986

Set all options

view details

Vince Foley

commit sha cd56b64c77dc2a07565056d9c897a191790c3e66

Changelog note

view details

Vince Foley

commit sha 2525d4ca1550853d733c9fec0817dbb45df10ece

Merge pull request #243 from binaryseed/complexity-via-put_options

Set all plug options via put_options

view details

push time in 3 days

issue closed absinthe-graphql/absinthe_plug

Dynamically generated max complexity

Hello! 👋 First of all, thank you for Absinthe and Absinthe.Plug!

I'm wondering if the maintainers would be open to the addition of a feature to dynamically determine the max complexity per query. I'd like to be able to say something like, "User X made a query, they have Y complexity allowance left, will I allow them to make this query?"

Of course this doesn't match the primary use case. 😄

Whereas the current API looks like this:

plug Absinthe.Plug,
  schema: MyAppWeb.Schema,
  analyze_complexity: true,
  max_complexity: 50

I would like to propose expanding the API to support this as well:

plug Absinthe.Plug,
  schema: MyAppWeb.Schema,
  analyze_complexity: true,
  max_complexity: &MyModule.max_complexity/1

Where &MyModule.max_complexity/1 is a function that is given the context and query and returns back an integer.
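
A hypothetical callback matching that proposal (purely illustrative of the proposed shape, not a shipped API; the argument pattern is assumed):

  defmodule MyModule do
    # Hypothetical: given the request's context, return this caller's
    # remaining complexity budget as an integer.
    def max_complexity(%{context: %{current_user: user}}) do
      user.complexity_allowance
    end
  end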

Thank you for considering! I would be glad to whip up a PR if you're open to this. Cheers!

closed time in 3 days

paulstatezny

pull request comment absinthe-graphql/absinthe_plug

Set all plug options via put_options

Ok, wired it up, mind giving it a quick review?

binaryseed

comment created time in 4 days

issue comment absinthe-graphql/absinthe

Change to imported module does not recompile outer module

Ah, just added import_types Boo.{B} # Note that the {} is required

dylan-chong

comment created time in 4 days

issue opened absinthe-graphql/absinthe

[BAD BUG] Change to imported module does not recompile outer module

Steps:

  1. Make schema A import types from module B
  2. Compile and run your app with an iex shell open
  3. Add a new type to schema B called :new_type
  4. Type recompile in the iex shell
  5. Run A.Compiled.__absinthe_type__(:new_type), which should return a non-nil value. BUT it returns nil!
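
A hypothetical minimal pair of modules matching the report (module and field names are illustrative, not the reporter's code):

  defmodule Boo.B do
    use Absinthe.Schema.Notation

    object :new_type do
      field :name, :string
    end
  end

  defmodule A.Compiled do
    use Absinthe.Schema

    import_types Boo.B

    query do
      field :placeholder, :string
    end
  end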

created time in 4 days

issue comment absinthe-graphql/absinthe

Macro-schema argument default values are not rendered in SDL

FYI, a little related: there's been other conversation about default_value / nil: https://github.com/absinthe-graphql/absinthe/issues/656

maartenvanvliet

comment created time in 5 days

issue comment absinthe-graphql/absinthe

Macro-schema argument default values are not rendered in SDL

@benwilson512 Thoughts on this issue?

maartenvanvliet

comment created time in 5 days

push event absinthe-graphql/absinthe

Vince Foley

commit sha 6b136faf0d7f06c1994a25547f34384b6c2adac9

Changelog entry

view details

push time in 5 days

push event absinthe-graphql/absinthe

Maarten van Vliet

commit sha ffb5e12b7d45ca9e33d2732cc59111fa7c7adadf

Render null default values in SDL (#1032)

view details

push time in 5 days

PR merged absinthe-graphql/absinthe

Render null default values in SDL

Literal null default values in arguments in the SDL notation triggered errors.

  • A pattern match was missing for the Language.NullValue when converting to Blueprint.Schema.InputValueDefinition
  • In the SDL renderer, the Blueprint.Input.Null was not matched in render_value
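
For illustration, a hypothetical SDL-based schema exercising this case (module name and field are made up):

  defmodule MyApp.Schema do
    use Absinthe.Schema

    # A literal null default in SDL notation; rendering this schema back to
    # SDL (e.g. via Absinthe.Schema.to_sdl/1) previously raised.
    import_sdl """
    type Query {
      posts(filter: String = null): [String]
    }
    """
  end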
+7 -0

1 comment

3 changed files

maartenvanvliet

pr closed time in 5 days

pull request comment absinthe-graphql/absinthe

Render null default values in SDL

Thanks!

maartenvanvliet

comment created time in 5 days

push event absinthe-graphql/absinthe

Ben Wilson

commit sha fd567c7d1f3127e40905ab5bc5ff4f15f571c759

Improved serialization failure messages (#1033)

* improved error messages on serialization failure
* tweak wording
* changelog entry

view details

push time in 5 days

delete branch absinthe-graphql/absinthe

delete branch: nicer-serialization-errors

delete time in 5 days

PR merged absinthe-graphql/absinthe

Improved serialization failure messages

This changes the relatively unhelpful message we have today:

** (Absinthe.SerializationError) Value 1.0 is not a valid integer

into something that actually tells you what field to go look for to fix the issue:

** (Absinthe.SerializationError) Could not serialize term 1.0 as type Int
When serializing the field:
RootQueryType.bad_integer (/path/to/schema/module.ex:8)
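
A hypothetical field that would produce the message above: an Int field whose resolver returns a float, so serialization fails while building the response.

  field :bad_integer, :integer do
    # Returning a float from an :integer field triggers the SerializationError.
    resolve fn _, _ -> {:ok, 1.0} end
  end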
+17 -1

0 comments

2 changed files

benwilson512

pr closed time in 5 days

Pull request review comment absinthe-graphql/absinthe

Improved serialization failure messages

 # Changelog
+## 1.6.1
+
+- Feature: [Improved serialization failure messages](https://github.com/absinthe-graphql/absinthe/pull/1033)

The only tricky bit is the lack of a PR # before submitting :)

benwilson512

comment created time in 5 days

issue opened absinthe-graphql/absinthe_relay

ParseIDs middleware cannot find schema_node on subsequent `input` field

I believe that we have encountered a bug with the ParseIDs middleware, and it may be related to #72.

I have a definition that roughly mirrors this:

  input_object(:post_comments_input) do
    field(:posted_by_user_id, :id)
  end

  connection(:comments, node_type: :comment) do
    field(:total_count, non_null(:integer), resolve: &count_resolver/3)
    edge(do: nil)
  end

  node object(:comment) do
    field(:body, non_null(:string))
  end

  node object(:post) do
    connection field(:comments, node_type: :comments) do
      # I like to put query args in an input object because I can deprecate the fields
      arg(:input, :post_comments_input)

      # This is the problem area when calling updatePost
      middleware(Absinthe.Relay.Node.ParseIDs, input: [posted_by_user_id: :user])
      resolve(&resolve_it/2)
    end
  end

  payload field(:update_post) do
    middleware(Absinthe.Relay.Node.ParseIDs, post_id: :post)

    input do
      field(:post_id, :id)
      field(:body, :string)
    end

    output do
      field(:post, non_null(:post))
    end

    resolve(&update_it/2)
  end

When I call the node query with posts there are no problems:

  query TestQuery($input:PostCommentsInput,$id:ID!) {
    post: node(id:$id) {
      ... on Post {
        comments(first:0,input:$input) {
          totalCount
        }
      }
    }
  }

The problem I am seeing happens when I use the updatePost mutation and then resolve the Post.comments connection like so:

  mutation TestMutation($input:UpdatePostInput!) {
    updatePost(input:$input) {
      post {
        comments(first:0) {
          totalCount
        }
      }
    }
  }

I get an error returned that states:

{
  "data": {
    "updatePost": {
      "post": {
        "comments": null
      }
    }
  },
  "errors": [
    {
      "message": "Could not find schema_node for input",
      "path": [
        "updatePost",
        "post",
        "comments"
      ]
    }
  ]
}

I have circumvented this problem by not using the ParseIDs middleware and manually parsing the IDs, but it is not a scalable solution. I am planning to write custom middleware for this scenario, but I'd rather not. It seems like something ParseIDs should support.

Let me know if I can help clarify this any further. I know it is not a typical use-case. Thank you!

created time in 5 days

pull request comment absinthe-graphql/absinthe_plug

updated project list link

Thanks!

bmuller

comment created time in 6 days

push event absinthe-graphql/absinthe_plug

Brian Muller

commit sha f403e802ff74bb934b0699848ec29d193c553738

updated project list link

view details

Vince Foley

commit sha a3792a3d6d7124bedb7020025a0fa5eeabee365e

Merge pull request #245 from bmuller/fix-README-link

updated project list link

view details

push time in 6 days

PR merged absinthe-graphql/absinthe_plug

updated project list link

Not sure where the project list link should go to, but the current location results in a 404.

+1 -1

0 comments

1 changed file

bmuller

pr closed time in 6 days

PR opened absinthe-graphql/absinthe_plug

updated project list link

Not sure where the project list link should go to, but the current location results in a 404.

+1 -1

0 comments

1 changed file

pr created time in 6 days
