José Valim josevalim @dashbitco Kraków, Poland https://dashbit.co/ Chief Adoption Officer at @dashbitco. Creator of @elixir-lang and proud ex-@plataformatec.

phoenixframework/phoenix_live_view 2657

Rich, real-time user experiences with server-rendered HTML

elixir-ecto/postgrex 711

PostgreSQL driver for Elixir

josevalim/enginex 452

An executable which creates a bare Rails 3 engine (which is used in Crafting Rails Applications)

msaraiva/surface 296

An experimental component based library for Phoenix LiveView

elixir-ecto/db_connection 174

Database connection behaviour

tmbb/makeup 83

Syntax highlighter for elixir inspired by Pygments

devinus/markdown 77

A simple Elixir Markdown to HTML conversion library

josevalim/xgen 57

Integrating Elixir, Mix and OTP

josevalim/defmodulep 55

API for defining and requiring private modules.

issue closed elixir-lang/ex_doc

Provide A Link In Hex Docs Package Referring To Hex Package Page

The documentation page for a package (Crawly on Hex Docs) should provide a link that refers back to the Hex package page (Crawly on Hex).

This will make the process of finding a package's metadata, repository, etc. simpler and easier, and could potentially improve SEO for a package.

closed time in 4 hours

s0kil

issue comment elixir-lang/ex_doc

Provide A Link In Hex Docs Package Referring To Hex Package Page

Hi @s0kil! ExDoc is a general documentation tool that is not tied to Hex, so we can't always assume that there is a Hex package. For example, we use it for Elixir and Elixir is not a package. For this reason, we suggest that you add links to the package source yourself. Thanks!
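
A minimal sketch of the suggested approach in mix.exs, assuming a standard Mix project using ExDoc (the app name and repository URL are placeholders; :source_url and the :docs options are the usual Mix/ExDoc settings):

# Point the generated docs back at the package's repository yourself.
def project do
  [
    app: :crawly,
    version: "0.1.0",
    source_url: "https://github.com/example/crawly",
    docs: [
      main: "readme",
      extras: ["README.md"]
    ]
  ]
end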

s0kil

comment created time in 4 hours

pull request comment erlang/otp

Allow changing the -mode when the system restarts

@rickard-green thank you. Feedback has been addressed.

josevalim

comment created time in 6 hours

push event josevalim/otp

José Valim

commit sha 24fc26590a5dccf0ae118ce38fb5d2f35ae11680

Allow changing the -mode when the system restarts

In order for Elixir developers to configure their systems using Elixir syntax, the Elixir application must be started. However, in order to start the Elixir application, kernel and stdlib need to be started, which means we can't configure those applications. Elixir addresses this by booting twice. First, the system boots with a minimal set of apps started, then it executes the configuration files, writes them to disk and restarts the system, which will pick up the new configuration.

The problem with such an approach is that, when running in embedded mode, starting and shutting down the system takes a long time, due to the loading and purging of all modules.

This PR addresses the problem by allowing the mode to be changed to embedded on init:restart/1. This allows Elixir to first boot in interactive mode, load the configuration, and then start the system in embedded mode. This reduces the boot time in the sample application that reproduces the problem from 5s to 1s.
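
A minimal sketch of the boot sequence described above, as it could look from the Elixir side (not the actual Elixir release boot code; the config path is a placeholder, and :init.restart/1 with a mode option is the API this PR adds):

# First boot happens in interactive mode, so modules can be loaded lazily
# while the runtime configuration is evaluated.
config = Config.Reader.read!("releases/0.1.0/runtime.exs")
Application.put_all_env(config)

# With this PR, the restart itself can switch the code loading mode, so the
# second boot starts straight in embedded mode with the new configuration.
:init.restart(mode: :embedded)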


push time in 6 hours

Pull request review comment erlang/otp

Introduce EEP-48 and help functions in shell

+#!/usr/bin/env escript+%% -*- erlang -*-+%% %CopyrightBegin%+%%+%% Copyright Ericsson AB 2020. All Rights Reserved.+%%+%% Licensed under the Apache License, Version 2.0 (the "License");+%% you may not use this file except in compliance with the License.+%% You may obtain a copy of the License at+%%+%%     http://www.apache.org/licenses/LICENSE-2.0+%%+%% Unless required by applicable law or agreed to in writing, software+%% distributed under the License is distributed on an "AS IS" BASIS,+%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.+%% See the License for the specific language governing permissions and+%% limitations under the License.+%%+%% %CopyrightEnd%+%%----------------------------------------------------------------------+%% File    : chunk.escript+%%+%% Created : 1 Nov 2018 by Kenneth Lundin <uabkeld@elxa31hr002>+%%+%% Does translation of Erlang XML docs to EEP-48 doc chunks.+%%----------------------------------------------------------------------++-mode(compile).++-include_lib("kernel/include/eep48.hrl").++main([FromXML, FromBeam, _Escript, ToChunk]) ->+    erlang:process_flag(max_heap_size,20 * 1000 * 1000),+    case docs(FromXML, FromBeam) of+        {error, Reason} ->+            io:format("Failed to create chunks: ~p~n",[Reason]),+            erlang:halt(1);+        {docs_v1,_,_,_,_,#{ source := S },[]} when+              S =/= "../xml/gen_fsm.xml",+              S =/= "../xml/shell_default.xml",+              S =/= "../xml/user.xml",+              S =/= "../xml/wxClipboardTextEvent.xml",+              S =/= "../xml/wxDisplayChangedEvent.xml",+              S =/= "../xml/wxGBSizerItem.xml",+              S =/= "../xml/wxGraphicsBrush.xml",+              S =/= "../xml/wxGraphicsFont.xml",+              S =/= "../xml/wxGraphicsPen.xml",+              S =/= "../xml/wxInitDialogEvent.xml",+              S =/= "../xml/wxMaximizeEvent.xml",+              S =/= "../xml/wxMouseCaptureLostEvent.xml",+              S =/= "../xml/wxPaintEvent.xml",+              S =/= "../xml/wxPreviewCanvas.xml",+              S =/= "../xml/wxSysColourChangedEvent.xml",+              S =/= "../xml/wxTaskBarIconEvent.xml",+              S =/= "../xml/wxWindowCreateEvent.xml",+              S =/= "../xml/wxWindowDestroyEvent.xml",+              S =/= "../xml/wxDataObject.xml"+              ->+            io:format("Failed to create chunks: no functions found ~s~n",[S]),+            erlang:halt(1),+            ok;+        Docs ->+            ok = file:write_file(ToChunk, term_to_binary(Docs,[compressed]))+    end.++%% Error handling+%%----------------------------------------------------------------------++-define(error(Reason), +	throw({dom_error, Reason})).++%%----------------------------------------------------------------------++%%======================================================================+%% Records+%%======================================================================++%%----------------------------------------------------------------------+%% State record for the validator+%%----------------------------------------------------------------------+-record(state, {+	  tags=[],         %% Tag stack+	  cno=[],          %% Current node number+	  namespaces = [], %% NameSpace stack+	  dom=[]           %% DOM structure+	 }).++%%======================================================================+%% External functions+%%======================================================================++%%----------------------------------------------------------------------+%% 
Function: initial_state() -> Result+%% Parameters: +%% Result: +%% Description:+%%----------------------------------------------------------------------+initial_state() ->+    #state{}.++%%----------------------------------------------------------------------+%% Function: get_dom(State) -> Result+%% Parameters: +%% Result: +%% Description:+%%----------------------------------------------------------------------+get_dom(#state{dom=Dom}) ->+    Dom.++%%----------------------------------------------------------------------+%% Function: event(Event, LineNo, State) -> Result+%% Parameters: +%% Result: +%% Description:+%%----------------------------------------------------------------------+event(Event, _LineNo, State) ->+    build_dom(Event, State).+++%%======================================================================+%% Internal functions+%%======================================================================++%%----------------------------------------------------------------------+%% Function  : build_dom(Event, State) -> Result+%% Parameters: Event = term()+%%             State = #xmerl_sax_simple_dom_state{}+%% Result    : #xmerl_sax_simple_dom_state{} |+%% Description: +%%----------------------------------------------------------------------++%% Document+%%----------------------------------------------------------------------+build_dom(startDocument, State) ->+    State#state{dom=[startDocument]};+build_dom(endDocument, +	  #state{dom=[{Tag, Attributes, Content} |D]} = State) ->+    case D of+	[startDocument] ->+	    State#state{dom=[{Tag, Attributes, +                              lists:reverse(Content)}]};+	[Decl, startDocument] ->+	    State#state{dom=[Decl, {Tag, Attributes, +                                    lists:reverse(Content)}]};+	_ ->+            %% endDocument is also sent by the parser when a fault occur to tell +            %% the event receiver that no more input will be sent+	    State+    end;++%% Element+%%----------------------------------------------------------------------+build_dom({startElement, _Uri, LocalName, _QName, Attributes}, +	  #state{tags=T, dom=D} = State) ->++    A = parse_attributes(LocalName, Attributes),+    CName = list_to_atom(LocalName),++    State#state{tags=[CName |T],+                dom=[{CName,+                      lists:reverse(A),+                      []+                     } | D]};+build_dom({endElement, _Uri, LocalName, _QName}, +	  #state{tags=[_ |T],+                 dom=[{CName, CAttributes, CContent}, +                      {PName, PAttributes, PContent} = _Parent | D]} = State) ->+    case list_to_atom(LocalName) of+	CName ->+            MappedCName =+                case CName of+                    title ->+                        lists:nth(length([E || E <- T, E =:= section])+1,[h1,h2,h3]);+                    CName -> CName+                end,+                    +            State#state{tags=T,+                        dom=[{PName, PAttributes, +                              [{MappedCName, CAttributes, +                                lists:reverse(CContent)}+                               |PContent]+                             } | D]};+        _ ->+            ?error("Got end of element: " ++ LocalName ++ " but expected: " ++ +                       CName)+    end;++%% Text +%%----------------------------------------------------------------------+build_dom({characters, String},+	  #state{dom=[{Name, Attributes, Content}| D]} = State) ->+    HtmlEnts = [{"&nbsp;",[160]},+                {"&times;",[215]},+              
  {"&plusmn;",[177]},+                {"&ouml;","ö"},+                {"&auml;","ä"},+                {"&aring;","å"}+               ],++    NoHtmlEnt =+        lists:foldl(+          fun({Pat,Sub},Str) ->+                  re:replace(Str,Pat,Sub,[global,unicode])+          end,String,HtmlEnts),++    case re:run(NoHtmlEnt,"&[a-z]*;",[{capture,first,binary},unicode]) of+        nomatch -> ok;+        {match,[<<"&lt;">>]} -> ok;+        {match,[<<"&gt;">>]} -> ok;+        Else -> throw({found_illigal_thing,Else,String})+    end,+    NewContent =+        [unicode:characters_to_binary(NoHtmlEnt,utf8)| Content],+    State#state{dom=[{Name, Attributes, NewContent} | D]};++build_dom({ignorableWhitespace, String},+          #state{dom=[{Name,_,_} = _E|_]} = State) ->+    case lists:member(Name,+                      [p,pre,input,code,quote,warning,+                       note,dont,do,c,i,em,strong,+                       seealso,tag,item]) of+        true ->+%            io:format("Keep ign white: ~p ~p~n",[String, _E]),+            build_dom({characters, String}, State);+        false ->+            State+    end;++build_dom({startEntity, SysId}, State) ->+    io:format("startEntity:~p~n",[SysId]),+    State;++%% Default+%%----------------------------------------------------------------------+build_dom(_E, State) ->+    State. ++%%----------------------------------------------------------------------+%% Function  : parse_attributes(ElName, Attributes) -> Result+%% Parameters: +%% Result    : +%% Description: +%%----------------------------------------------------------------------+parse_attributes(ElName, Attributes) ->+    parse_attributes(ElName, Attributes, 1, []).++parse_attributes(_, [], _, Acc) ->+    Acc;+parse_attributes(ElName, [{_Uri, _Prefix, LocalName, AttrValue} |As], N, Acc) ->  +    parse_attributes(ElName, As, N+1, [{list_to_atom(LocalName), AttrValue} |Acc]).++docs(OTPXml, FromBEAM)->+    case xmerl_sax_parser:file(OTPXml,+                               [skip_external_dtd,+                                {event_fun,fun event/3},+                                {event_state,initial_state()}]) of+        {ok,Tree,_} ->+            {ok, {Module, Chunks}} = beam_lib:chunks(FromBEAM,[exports,abstract_code]),+            Dom = get_dom(Tree),+            NewDom = transform(Dom,[]),+            Chunk = to_chunk(NewDom, OTPXml, Module, proplists:get_value(abstract_code, Chunks)),+            verify_chunk(Module,proplists:get_value(exports, Chunks), Chunk),+            Chunk;+        Else ->+            {error,Else}+    end.++verify_chunk(M, Exports, #docs_v1{ docs = Docs } = Doc) ->++    %% Make sure that each documented function actually is exported+    Exported = [begin+                    FA = {F,A},+                    {M,F,A,lists:member(FA,Exports)}+                end || {{function,F,A},_,_,_,_} <- Docs],+    lists:map(fun({_M,_F,_A,true}) ->+                      ok+              end,Exported),++    try+        shell_docs:validate(Doc)+    catch Err ->+            throw({binary_to_term(maps:get(<<"en">>,Doc#docs_v1.module_doc)), Err})+    end.++%% skip <erlref> but transform and keep its content+transform([{erlref,_Attr,Content}|T],Acc) ->+    Module = [Mod || Mod = {module,_,_} <- Content], +    NewContent = Content -- Module,+    [{module,SinceAttr,[Mname]}] = Module,+    Since = case proplists:get_value(since,SinceAttr) of+                undefined -> [];+                [] -> [];+                Vsn -> [{since,Vsn}]+            end,+    
transform([{module,[{name,Mname}|Since],NewContent}|T],Acc);++%% skip <header> and all of its content+transform([{header,_Attr,_Content}|T],Acc) ->+    transform(T,Acc);++transform([{section,_,Content}|T],Acc) ->+    transform([{p,[],transform(Content,[])}|T],Acc);+transform([{description,_,Content}|T],Acc) ->+    transform([{p,[],transform(Content,[])}|T],Acc);++%% transform <list><item> to <ul><li> or <ol><li> depending on type attribute +transform([{list,Attr,Content}|T],Acc) ->+    transform([transform_list(Attr,Content)|T],Acc);++%% transform <taglist>(tag,item+)+ to <dl>(dt,item+)++transform([{taglist,Attr,Content}|T],Acc) ->+    transform([transform_taglist(Attr,Content)|T],Acc);++%% transform <c><anno>text</anno></c> to <anno>text</anno>+transform([{c,[],[{anno,[],AnnoContent}]}|T],Acc) ->+    transform(T,[{a,[{type,anno}],AnnoContent}|Acc]);++%% transform <funcs> with <func> as children+transform([{funcs,_Attr,Content}|T],Acc) ->+    Fns = {functions,[],transform_funcs(Content, [])},+    transform(T,[Fns|Acc]);+%% transform <datatypes> with <datatype> as children+transform([{datatypes,_Attr,Content}|T],Acc) ->+    Dts = transform(Content, []),+    transform(T,[{datatypes,[],Dts}|Acc]);+transform([{datatype,_Attr,Content}|T],Acc) ->+    transform(T,transform_datatype(Content, []) ++ Acc);+%% Ignore <datatype_title>+transform([{datatype_title,_Attr,_Content}|T],Acc) ->+    transform(T,Acc);+%% transform <desc>Content</desc> to Content+transform([{desc,_Attr,Content}|T],Acc) ->+    transform(T,[transform(Content,[])|Acc]);+transform([{strong,Attr,Content}|T],Acc) ->+    transform([{em,Attr,Content}|T],Acc);+%% transform <marker id="name"/>  to <a id="name"/>....+transform([{marker,Attr,Content}|T],Acc) ->+    transform(T,[{a,Attr,transform(Content,[])}|Acc]);+%% transform <url href="external URL"> Content</url> to <a href....+transform([{url,Attr,Content}|T],Acc) ->+    transform(T,[{a,Attr,transform(Content,[])}|Acc]);+%% transform note/warning/do/don't to <p class="thing">+transform([{What,[],Content}|T],Acc)+  when What =:= note; What =:= warning; What =:= do; What =:= dont ->+    WhatP = {p,[{class,atom_to_list(What)}], transform(Content,[])},+    transform(T,[WhatP|Acc]);++transform([{type,_,[]}|_] = Dom,Acc) ->+    %% Types are laid out sequentially in the source xml so we need to+    %% parse them like that here too.+    case transform_types(Dom,[]) of+        {[],T} ->+            transform(T,Acc);+        {Types,T} ->+            %% We sort the types here because in the source xml+            %% the description and the declaration do not have+            %% to be next to each other. 
But we want to have that+            %% for the doc chunks.+            NameSort = fun({li,A,_},{li,B,_}) ->+                               NameA = proplists:get_value(name,A),+                               NameB = proplists:get_value(name,B),+                               if NameA == NameB ->+                                       length(A) =< length(B);+                                  true ->+                                       NameA < NameB+                               end+                       end,+            transform(T,[{ul,[{class,"types"}],lists:sort(NameSort,Types)}|Acc])+    end;+transform([{type_desc,Attr,_Content}|T],Acc) ->+    %% We skip any type_desc with the variable attribute+    true = proplists:is_defined(variable, Attr),+    transform(T,Acc);+transform([{type,[],Content}|T],Acc) ->+    transform(T,[{ul,[{class,"types"}],transform(Content,[])}|Acc]);+transform([{v,[],Content}|T],Acc) ->+    transform(T, [{li,[{class,"type"}],transform(Content,[])}|Acc]);+transform([{d,[],Content}|T],Acc) ->+    transform(T, [{li,[{class,"description"}],transform(Content,[])}|Acc]);++transform([Tag = {seealso,_Attr,_Content}|T],Acc) ->+    transform(T,[transform_seealso(Tag)|Acc]);++transform([{term,Attr,[]}|T],Acc) ->+    transform([list_to_binary(proplists:get_value(id,Attr))|T],Acc);++transform([{fsummary,_,_}|T],Acc) ->+    %% We skip fsummary as it many times is just a duplicate of the+    %% first line of the docs.+    transform(T,Acc);++transform([{input,_,Content}|T],Acc) ->+    %% Just remove input as it is not used by anything+    transform(T,[transform(Content,[])|Acc]);++%% Tag and Attr is used as is but Content is transformed+transform([{Tag,Attr,Content}|T],Acc) ->+    transform(T,[{Tag,Attr,transform(Content,[])}|Acc]);+transform([Binary|T],Acc) ->+    transform(T,[Binary|Acc]);+transform([],Acc) ->+    lists:flatten(lists:reverse(Acc)).++transform_list([{type,"ordered"}],Content) ->+    {ol,[],[{li,A2,C2}||{item,A2,C2}<-Content]};+transform_list(_,Content) ->+    {ul,[],[{li,A2,C2}||{item,A2,C2}<-Content]}.++transform_types([{type,Attr,[]}|T],Acc) ->+    case proplists:is_defined(name,Attr) of+        true ->+            transform_types(T, [{li,Attr,[]}|Acc]);+        false ->+            true = proplists:is_defined(variable, Attr),+            transform_types(T, Acc)+    end;+transform_types([{type_desc,Attr,Content}|T],Acc) ->+    case proplists:is_defined(name,Attr) of+        true ->+            TypeDesc = transform(Content,[]),+            transform_types(T, [{li,Attr ++ [{class,"description"}],TypeDesc}|Acc]);+        false ->+            true = proplists:is_defined(variable, Attr),+            transform_types(T, Acc)+    end;+transform_types([{type,_,_}|_T],_Acc) ->+    throw(mixed_type_declarations);+transform_types(Dom,Acc) ->+    {lists:reverse(Acc),Dom}.++transform_taglist(Attr,Content) ->+    Items =+        lists:map(fun({tag,A,C}) ->+                          {dt,A,transform(C, [])};+                     ({item,A,C}) ->+                          {dd,A,transform(C, [])}+                  end, Content),+    {dl,Attr,Items}.++%% if we have {func,[],[{name,...},{name,....},...]}+%% we convert it to one {func,[],[{name,...}] per arity lowest first.    
+transform_funcs([Func|T],Acc) ->+    transform_funcs(T,func2func(Func) ++ Acc);+transform_funcs([],Acc) ->+    lists:reverse(Acc).++func2func({func,Attr,Contents}) ->++    ContentsNoName = [NC||NC <- Contents, element(1,NC) /= name],++    EditLink =+        case proplists:get_value(ghlink,Attr) of+            undefined ->+                #{};+            GhLink ->+                #{ edit_url =>+                       iolist_to_binary(["https://github.com/erlang/otp/edit/",GhLink]) }+        end,++    VerifyNameList =+        fun(NameList, Test) ->+                %% Assert that we don't mix ways to write <name>+                _ =+                    [begin+                         ok = Test(C),+                         {proplists:get_value(name,T),proplists:get_value(arity,T)}+                     end || {name,T,C} <- NameList]+        end,++    NameList = [Name || {name,_,_} = Name <- Contents],++    %% "Since" is hard to accurately as there can be multiple <name> per <func> and they+    %% can refer to the same or other arities. This should be improved in the future but+    %% for now we set since to a comma separated list of all since attributes.+    SinceMD =+        case [proplists:get_value(since, SinceAttr) ||+                 {name,SinceAttr,_} <- NameList, proplists:get_value(since, SinceAttr) =/= []] of+            [] -> EditLink;+            Sinces ->+                EditLink#{ since => unicode:characters_to_binary(+                                      lists:join(",",lists:usort(Sinces))) }+        end,++    Functions =+        case NameList of+            [{name,_,[]}|_] ->+                %% Spec style function docs+                TagsToFA =+                    fun(Tags) ->+                            {proplists:get_value(name,Tags),+                             proplists:get_value(arity,Tags)}+                    end,++                VerifyNameList(NameList,fun([]) -> ok end),++                FAs = [TagsToFA(FAttr) || {name,FAttr,[]} <- NameList ],+                FAClauses = lists:usort([{TagsToFA(FAttr),proplists:get_value(clause_i,FAttr)}+                                         || {name,FAttr,[]} <- NameList ]),+                Signature = [iolist_to_binary([F,"/",A]) || {F,A} <- FAs],+                lists:map(+                  fun({F,A}) ->+                          Specs = [{func_to_atom(CF),list_to_integer(CA),C}+                                   || {{CF,CA},C} <- FAClauses,+                                      F =:= CF, A =:= CA],+                          {function,[{name,F},{arity,list_to_integer(A)},+                                     {signature,Signature},+                                     {meta,SinceMD#{ signature => Specs }}],+                           ContentsNoName}+                  end, lists:usort(FAs));+            NameList ->+                %% Manual style function docs+                FAs = lists:flatten([func_to_tuple(NameString) || {name, _Attr, NameString} <- NameList]),++                VerifyNameList(NameList,fun([_|_]) -> ok end),++                Signature = [strip_tags(NameString) || {name, _Attr, NameString} <- NameList],+                [{function,[{name,F},{arity,A},+                            {signature,Signature},+                            {meta,SinceMD}],ContentsNoName}+                 || {F,A} <- lists:usort(FAs)]+        end,+    transform(Functions,[]).++func_to_tuple(Chars) ->+    try+        [Name,Args] = string:split(strip_tags(Chars),"("),+        Arities = parse_args(unicode:characters_to_list(Args)),+        
[{unicode:characters_to_list(Name),Arity} || Arity <- Arities]+    catch E:R:ST ->+            io:format("Failed to parse: ~p~n",[Chars]),+            erlang:raise(E,R,ST)+    end.++%% This function parses a documentation <name> attribute to figure+%% out the arities if that function. Example:+%%    "start([go,Mode] [,Extra])" returns [1, 2].+%%+%% This assumes that when a single <name> describes many arities+%% the arities are listed with [, syntax.+parse_args(")" ++ _) ->+    [0];+parse_args(Args) ->+    parse_args(unicode:characters_to_list(Args),1,[]).+parse_args([$[,$,|T],Arity,[]) ->+    parse_args(T,Arity,[$[]) ++ parse_args(T,Arity+1,[]);+parse_args([$,|T],Arity,[]) ->+    parse_args(T,Arity+1,[]);+parse_args([Open|T],Arity,Stack)+  when Open =:= $[; Open =:= ${; Open =:= $( ->+    parse_args(T,Arity,[Open|Stack]);+parse_args([$]|T],Arity,[$[|Stack]) ->+    parse_args(T,Arity,Stack);+parse_args([$}|T],Arity,[${|Stack]) ->+    parse_args(T,Arity,Stack);+parse_args([$)|T],Arity,[$(|Stack]) ->+    parse_args(T,Arity,Stack);+parse_args([$)|_T],Arity,[]) ->+    [Arity];+parse_args([_H|T],Arity,Stack) ->+    parse_args(T,Arity,Stack).++strip_tags([{_Tag,_Attr,Content}|T]) ->+    [Content | strip_tags(T)];+strip_tags([H|T]) when not is_tuple(H) ->+    [H | strip_tags(T)];+strip_tags([]) ->+    [].++transform_datatype(Dom,_Acc) ->+    ContentsNoName = transform([NC||NC <- Dom, element(1,NC) /= name],[]),+    [case N of+          {name,NameAttr,[]} ->+              {datatype,NameAttr,ContentsNoName};+          {name,[],Content} ->+              [{Name,Arity}] = func_to_tuple(Content),+              Signature = strip_tags(Content),+              {datatype,[{name,Name},{n_vars,integer_to_list(Arity)},+                         {signature,Signature}],ContentsNoName}+      end || N = {name,_,_} <- Dom].++transform_seealso(_S = {seealso,_Attr,_Content}) ->+    _Content.++to_chunk(Dom, Source, Module, AST) ->+    [{module,MAttr,Mcontent}] = Dom,++    ModuleDocs = lists:flatmap(+                   fun({p,_,Content}) ->+                           Content;+                      ({_,_,_}) ->+                           []+                   end, Mcontent),++    TypeMeta = add_types(AST, maps:from_list([{source,Source}|MAttr])),++    TypeMap = maps:get(types, TypeMeta, []),++    Anno = erl_anno:set_file(atom_to_list(Module)++".erl",erl_anno:new(0)),++    Types = lists:flatten([Types || {datatypes,[],Types} <- Mcontent]),++    TypeEntries =+        lists:map(+          fun({datatype,Attr,Descr}) ->+                  TypeName = func_to_atom(proplists:get_value(name,Attr)),+                  TypeArity = case proplists:get_value(n_vars,Attr) of+                                  undefined ->+                                      find_type_arity(TypeName, TypeMap);+                                  Arity ->+                                      list_to_integer(Arity)+                              end,+                  TypeArgs = lists:join(",",[lists:concat(["Arg",I]) || I <- lists:seq(1,TypeArity)]),+                  PlaceholderSig = io_lib:format("-type ~p(~s) :: term().",[TypeName,TypeArgs]),+                  TypeSignature = proplists:get_value(+                                    signature,Attr,[iolist_to_binary(PlaceholderSig)]),+                  MetaSig =+                      case maps:get({TypeName, TypeArity}, TypeMap, undefined) of+                          undefined ->+                              #{};+                          Sig ->+                              #{ signature => [Sig] }+      
                end,+                  docs_v1_entry(type, Anno, TypeName, TypeArity, TypeSignature, MetaSig, Descr)+          end, Types),++    Functions = lists:flatten([Functions || {functions,[],Functions} <- Mcontent]),++    FuncEntrys =+        lists:flatmap(+          fun({function,Attr,Fdoc}) ->+                  case func_to_atom(proplists:get_value(name,Attr)) of+                      callback ->+                          [];+                      Name ->+                          Arity = proplists:get_value(arity,Attr),+                          Signature = proplists:get_value(signature,Attr),+                          FMeta = proplists:get_value(meta,Attr),+                          MetaWSpec = add_spec(AST,FMeta),+                          [docs_v1_entry(function, Anno, Name, Arity, Signature, MetaWSpec, Fdoc)]+                  end+          end, Functions),++    docs_v1(ModuleDocs, Anno, TypeMeta, FuncEntrys ++ TypeEntries).++docs_v1(DocContents, Anno, Metadata, Docs) ->+    #docs_v1{ anno = Anno,+              module_doc = #{<<"en">> => term_to_binary(shell_docs:normalize(DocContents))},

We had discussions about this and we were considering updating EEP 48 to allow any term to be given here, so we don't do a double term_to_binary. What are your final thoughts? I can send a PR to EEP 48 later today if we want to lift this restriction. We already need to update it to mention edit_url.
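
For context, a rough sketch of the double encoding being discussed (shapes simplified; the format string is only a placeholder, see the ?NATIVE_FORMAT question below):

# The doc AST for the module documentation...
doc_ast = [{:p, [], ["Module documentation goes here."]}]

# ...is first encoded on its own, because the chunk currently stores the
# per-language doc as a binary...
module_doc = %{"en" => :erlang.term_to_binary(doc_ast)}

# ...and then the whole docs_v1 tuple is encoded again when written to the
# "Docs" chunk, hence the double term_to_binary.
docs_v1 = {:docs_v1, 0, :erlang, "text/erlang_doc", module_doc, %{}, []}
chunk = :erlang.term_to_binary(docs_v1, [:compressed])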

garazdawi

comment created time in 7 hours

Pull request review comment erlang/otp

Introduce EEP-48 and help functions in shell

 where_is_file(Tail, File, Path, Files) ->             where_is_file(Tail, File)     end. +-spec get_doc(Mod) -> {ok, Res} | {error, Reason} when+      Mod :: module(),+      Res :: #docs_v1{},+      Reason :: non_existing | missing | file:posix().+get_doc(Mod) when is_atom(Mod) ->+    case which(Mod) of+        preloaded ->+            Fn = filename:join([code:lib_dir(erts),"ebin",atom_to_list(Mod) ++ ".beam"]),+            get_doc_chunk(Fn, Mod);+        Error when is_atom(Error) ->+            {error, Error};+        Fn ->+            get_doc_chunk(Fn, Mod)+    end.++get_doc_chunk(Filename, Mod) when is_atom(Mod) ->+    case beam_lib:chunks(Filename, ["Docs"]) of+        {error,beam_lib,{missing_chunk,_,_}} ->                +            case get_doc_chunk(Filename, atom_to_list(Mod)) of+                {error,missing} ->+                    get_doc_chunk_from_ast(Filename);+                Error ->+                    Error+            end;+        {error,beam_lib,{file_error,Filename,enoent}} ->+            get_doc_chunk(Filename, atom_to_list(Mod));+        {ok, {Mod, [{"Docs",Bin}]}} ->+            binary_to_term(Bin)+    end;+get_doc_chunk(Filename, Mod) ->+    case filename:dirname(Filename) of+        Filename ->+            {error,missing};+        Dir ->+            ChunkFile = filename:join([Dir,"doc","chunks",Mod ++ ".chunk"]),+            case file:read_file(ChunkFile) of+                {ok, Bin} ->+                    {ok, binary_to_term(Bin)};+                {error,enoent} ->+                    get_doc_chunk(Dir, Mod);+                {error,Reason} ->+                    {error,Reason}+            end+    end.++get_doc_chunk_from_ast(Filename) ->+    case beam_lib:chunks(Filename, [abstract_code]) of+        {error,beam_lib,{missing_chunk,_,_}} ->+            {error,missing};+        {ok, {_Mod, [{abstract_code,+                      {raw_abstract_v1, AST}}]}} ->+            Docs = get_function_docs_from_ast(AST),+            {ok, #docs_v1{ anno = 0, beam_language = erlang, format =  <<"text/erlang_doc">>,

Should this be ?NATIVE_FORMAT or <<"text/erlang_doc">>?

garazdawi

comment created time in 7 hours

Pull request review comment erlang/otp

Introduce EEP-48 and help functions in shell

+%%+%% %CopyrightBegin%+%%+%% Copyright Ericsson AB 1996-2018. All Rights Reserved.+%%+%% Licensed under the Apache License, Version 2.0 (the "License");+%% you may not use this file except in compliance with the License.+%% You may obtain a copy of the License at+%%+%%     http://www.apache.org/licenses/LICENSE-2.0+%%+%% Unless required by applicable law or agreed to in writing, software+%% distributed under the License is distributed on an "AS IS" BASIS,+%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.+%% See the License for the specific language governing permissions and+%% limitations under the License.+%%+%% %CopyrightEnd%+%%+-module(shell_docs).++-include("eep48.hrl").++-export([render/2, render/3, render/4]).+-export([render_type/2, render_type/3, render_type/4]).++%% Used by chunks.escript in erl_docgen+-export([validate/1, normalize/1]).++%% Convinience functions+-export([get_doc/1, get_doc/3, get_type_doc/3]).++-record(config, { docs,+                  io_opts = io:getopts(),+                  io_columns = element(2,io:columns())+                }).++-define(ALL_TAGS,[a,anno,p,h1,h2,h3,c,i,br,em,pre,code,ul,ol,li,dl,dt,dd]).+-type chunk_element_type() :: a | anno | p | c | i | br | em | pre |+                              code | ul | ol | li | dl | dt | dd.+-type chunk_element_attr() :: {atom(),unicode:chardata()}.+-type chunk_element_attrs() :: [chunk_element_attr()].+-type chunk_element() :: {chunk_element_type(),chunk_element_attrs(),+                          chunk_elements()} | binary().+-type chunk_elements() :: [chunk_element()].+-type docs_v1() :: #docs_v1{}.+++-spec validate(Module) -> ok when+      Module :: module() | docs_v1().+%% Simple validation of erlang doc chunk. Check that all tags are supported and+%% that the signature is correct.+validate(Module) when is_atom(Module) ->+    {ok, Doc} = code:get_doc(Module),+    validate(Doc);+validate(#docs_v1{ module_doc = MDocs, docs = AllDocs }) ->+    _ = maps:map(fun(_Key,MDoc) -> validate(binary_to_term(MDoc)) end, MDocs),+    lists:map(fun({_,_Anno, Sig, Docs, _Meta}) ->+                      case lists:all(fun erlang:is_binary/1, Sig) of+                          false -> throw({invalid_signature,Sig});+                          true -> ok+                      end,+                      maps:map(fun(_Key,Doc) -> validate(binary_to_term(Doc)) end, Docs)+              end, AllDocs);+validate([H|T]) when is_tuple(H) ->+    _ = validate(H),+    validate(T);+validate({Tag,Attr,Content}) ->+    case lists:member(Tag,?ALL_TAGS) of+        false ->+            throw({invalid_tag,Tag});+        true ->+            ok+    end,+    true = is_list(Attr),+    validate(Content);+validate([Chars | T]) when is_binary(Chars) ->+    validate(T);+validate([]) ->+    ok.++%% Follows algorithm described here:+%% * https://medium.com/@patrickbrosset/when-does-white-space-matter-in-html-b90e8a7cdd33+%% which in turn follows this:+%% * https://www.w3.org/TR/css-text-3/#white-space-processing+-spec normalize(Docs) -> NormalizedDocs when+      Docs :: chunk_elements(),+      NormalizedDocs :: chunk_elements().+normalize(Docs) ->+    Trimmed = normalize_trim(Docs,true),+    normalize_space(Trimmed).++normalize_trim(Bin,true) when is_binary(Bin) ->+    %% Remove any whitespace (except \n) before or after a newline+    NoSpace = re:replace(Bin,"[^\\S\n]*\n+[^\\S\n]*","\n",[global]),+    %% Replace any tabs with space+    NoTab = re:replace(NoSpace,"\t"," ",[global]),+    %% Replace any newlines with space+    
NoNewLine = re:replace(NoTab,"\\v"," ",[global]),+    %% Replace any sequences of \s with a single " "+    re:replace(NoNewLine,"\\s+"," ",[global,{return,binary}]);+normalize_trim(Bin,false) when is_binary(Bin) ->+    Bin;+normalize_trim([{Tag,Attr,Content}|T],Trim) when Tag =:= pre;+                                                 Tag =:= code ->+    [{Tag,Attr,normalize_trim(Content,false)} | normalize_trim(T,Trim)];+normalize_trim([{Tag,Attr,Content}|T],Trim) ->+    [{Tag,Attr,normalize_trim(Content,Trim)} | normalize_trim(T,Trim)];+normalize_trim([<<>>|T],Trim) ->+    normalize_trim(T,Trim);+normalize_trim([B1,B2|T],Trim) when is_binary(B1),is_binary(B2) ->+    normalize_trim([<<B1/binary,B2/binary>> | T],Trim);+normalize_trim([H|T],Trim) ->+    [normalize_trim(H,Trim) | normalize_trim(T,Trim)];+normalize_trim([],_Trim) ->+    [].

Fantastic!

garazdawi

comment created time in 7 hours

Pull request review comment erlang/otp

Introduce EEP-48 and help functions in shell

 where_is_file(Tail, File, Path, Files) ->             where_is_file(Tail, File)     end. +-spec get_doc(Mod) -> {ok, Res} | {error, Reason} when+      Mod :: module(),+      Res :: #docs_v1{},+      Reason :: non_existing | missing | file:posix().+get_doc(Mod) when is_atom(Mod) ->+    case which(Mod) of+        preloaded ->+            Fn = filename:join([code:lib_dir(erts),"ebin",atom_to_list(Mod) ++ ".beam"]),+            get_doc_chunk(Fn, Mod);+        Error when is_atom(Error) ->+            {error, Error};+        Fn ->+            get_doc_chunk(Fn, Mod)+    end.++get_doc_chunk(Filename, Mod) when is_atom(Mod) ->+    case beam_lib:chunks(Filename, ["Docs"]) of+        {error,beam_lib,{missing_chunk,_,_}} ->                +            case get_doc_chunk(Filename, atom_to_list(Mod)) of+                {error,missing} ->+                    get_doc_chunk_from_ast(Filename);+                Error ->+                    Error+            end;+        {error,beam_lib,{file_error,Filename,enoent}} ->+            get_doc_chunk(Filename, atom_to_list(Mod));+        {ok, {Mod, [{"Docs",Bin}]}} ->+            binary_to_term(Bin)+    end;+get_doc_chunk(Filename, Mod) ->+    case filename:dirname(Filename) of+        Filename ->+            {error,missing};+        Dir ->+            ChunkFile = filename:join([Dir,"doc","chunks",Mod ++ ".chunk"]),+            case file:read_file(ChunkFile) of+                {ok, Bin} ->+                    {ok, binary_to_term(Bin)};+                {error,enoent} ->+                    get_doc_chunk(Dir, Mod);+                {error,Reason} ->+                    {error,Reason}+            end+    end.++get_doc_chunk_from_ast(Filename) ->

I see. We do the same but the generation of the "fake chunk" is inside our "shell_docs". I guess some tools would find the "fake chunk" useful but others may not, so I guess both approaches work. Thanks for the info!

garazdawi

comment created time in 7 hours

pull request comment erlang/otp

Introduce EEP-48 and help functions in shell

At the moment there are no header tags, but this morning I found some uses of them so I think I will add them. I assume <h1>, <h2> and <h3> should be enough?

FWIW, Elixir assumes that <h1> is always the signature (i.e. it is implicit, you never write it yourself) and people may use <h2> and <h3> in the docs. For instance, we would have a <h2>Examples</h2> heading when listing examples.
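
To illustrate the convention described above (module and function names are made up): in an Elixir @doc string the signature acts as the implicit <h1>, while Markdown "##" headings such as "## Examples" end up as <h2> in the docs.

defmodule Demo do
  @doc """
  Adds two numbers.

  ## Examples

      iex> Demo.add(1, 2)
      3
  """
  def add(a, b), do: a + b
end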

garazdawi

comment created time in 7 hours

Pull request review comment erlang/otp

Introduce EEP-48 and help functions in shell

+%%+%% %CopyrightBegin%+%%+%% Copyright Ericsson AB 1996-2018. All Rights Reserved.+%%+%% Licensed under the Apache License, Version 2.0 (the "License");+%% you may not use this file except in compliance with the License.+%% You may obtain a copy of the License at+%%+%%     http://www.apache.org/licenses/LICENSE-2.0+%%+%% Unless required by applicable law or agreed to in writing, software+%% distributed under the License is distributed on an "AS IS" BASIS,+%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.+%% See the License for the specific language governing permissions and+%% limitations under the License.+%%+%% %CopyrightEnd%+%%+-module(shell_docs).++-include("eep48.hrl").++-export([render/2, render/3, render/4]).+-export([render_type/2, render_type/3, render_type/4]).++%% Used by chunks.escript in erl_docgen+-export([validate/1, normalize/1]).++%% Convinience functions+-export([get_doc/1, get_doc/3, get_type_doc/3]).++-record(config, { docs,+                  io_opts = io:getopts(),+                  io_columns = element(2,io:columns())+                }).++-define(ALL_TAGS,[a,anno,p,h1,h2,h3,c,i,br,em,pre,code,ul,ol,li,dl,dt,dd]).+-type chunk_element_type() :: a | anno | p | c | i | br | em | pre |+                              code | ul | ol | li | dl | dt | dd.+-type chunk_element_attr() :: {atom(),unicode:chardata()}.+-type chunk_element_attrs() :: [chunk_element_attr()].+-type chunk_element() :: {chunk_element_type(),chunk_element_attrs(),+                          chunk_elements()} | binary().+-type chunk_elements() :: [chunk_element()].+-type docs_v1() :: #docs_v1{}.+++-spec validate(Module) -> ok when+      Module :: module() | docs_v1().+%% Simple validation of erlang doc chunk. Check that all tags are supported and+%% that the signature is correct.+validate(Module) when is_atom(Module) ->+    {ok, Doc} = code:get_doc(Module),+    validate(Doc);+validate(#docs_v1{ module_doc = MDocs, docs = AllDocs }) ->+    _ = maps:map(fun(_Key,MDoc) -> validate(binary_to_term(MDoc)) end, MDocs),+    lists:map(fun({_,_Anno, Sig, Docs, _Meta}) ->+                      case lists:all(fun erlang:is_binary/1, Sig) of+                          false -> throw({invalid_signature,Sig});+                          true -> ok+                      end,+                      maps:map(fun(_Key,Doc) -> validate(binary_to_term(Doc)) end, Docs)+              end, AllDocs);+validate([H|T]) when is_tuple(H) ->+    _ = validate(H),+    validate(T);+validate({Tag,Attr,Content}) ->+    case lists:member(Tag,?ALL_TAGS) of+        false ->+            throw({invalid_tag,Tag});+        true ->+            ok+    end,+    true = is_list(Attr),+    validate(Content);+validate([Chars | T]) when is_binary(Chars) ->+    validate(T);+validate([]) ->+    ok.++%% Follows algorithm described here:+%% * https://medium.com/@patrickbrosset/when-does-white-space-matter-in-html-b90e8a7cdd33+%% which in turn follows this:+%% * https://www.w3.org/TR/css-text-3/#white-space-processing+-spec normalize(Docs) -> NormalizedDocs when+      Docs :: chunk_elements(),+      NormalizedDocs :: chunk_elements().+normalize(Docs) ->+    Trimmed = normalize_trim(Docs,true),+    normalize_space(Trimmed).++normalize_trim(Bin,true) when is_binary(Bin) ->+    %% Remove any whitespace (except \n) before or after a newline+    NoSpace = re:replace(Bin,"[^\\S\n]*\n+[^\\S\n]*","\n",[global]),+    %% Replace any tabs with space+    NoTab = re:replace(NoSpace,"\t"," ",[global]),+    %% Replace any newlines with space+    
NoNewLine = re:replace(NoTab,"\\v"," ",[global]),+    %% Replace any sequences of \s with a single " "+    re:replace(NoNewLine,"\\s+"," ",[global,{return,binary}]);+normalize_trim(Bin,false) when is_binary(Bin) ->+    Bin;+normalize_trim([{Tag,Attr,Content}|T],Trim) when Tag =:= pre;+                                                 Tag =:= code ->+    [{Tag,Attr,normalize_trim(Content,false)} | normalize_trim(T,Trim)];+normalize_trim([{Tag,Attr,Content}|T],Trim) ->+    [{Tag,Attr,normalize_trim(Content,Trim)} | normalize_trim(T,Trim)];+normalize_trim([<<>>|T],Trim) ->+    normalize_trim(T,Trim);+normalize_trim([B1,B2|T],Trim) when is_binary(B1),is_binary(B2) ->+    normalize_trim([<<B1/binary,B2/binary>> | T],Trim);+normalize_trim([H|T],Trim) ->+    [normalize_trim(H,Trim) | normalize_trim(T,Trim)];+normalize_trim([],_Trim) ->+    [].

Please ignore. I see this is already used by docgen. :D

garazdawi

comment created time in 7 hours

Pull request review comment erlang/otp

Introduce EEP-48 and help functions in shell

+%%+%% %CopyrightBegin%+%%+%% Copyright Ericsson AB 1996-2018. All Rights Reserved.+%%+%% Licensed under the Apache License, Version 2.0 (the "License");+%% you may not use this file except in compliance with the License.+%% You may obtain a copy of the License at+%%+%%     http://www.apache.org/licenses/LICENSE-2.0+%%+%% Unless required by applicable law or agreed to in writing, software+%% distributed under the License is distributed on an "AS IS" BASIS,+%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.+%% See the License for the specific language governing permissions and+%% limitations under the License.+%%+%% %CopyrightEnd%+%%+-module(shell_docs).++-include("eep48.hrl").++-export([render/2, render/3, render/4]).+-export([render_type/2, render_type/3, render_type/4]).++%% Used by chunks.escript in erl_docgen+-export([validate/1, normalize/1]).++%% Convinience functions+-export([get_doc/1, get_doc/3, get_type_doc/3]).++-record(config, { docs,+                  io_opts = io:getopts(),+                  io_columns = element(2,io:columns())+                }).++-define(ALL_TAGS,[a,anno,p,h1,h2,h3,c,i,br,em,pre,code,ul,ol,li,dl,dt,dd]).+-type chunk_element_type() :: a | anno | p | c | i | br | em | pre |+                              code | ul | ol | li | dl | dt | dd.+-type chunk_element_attr() :: {atom(),unicode:chardata()}.+-type chunk_element_attrs() :: [chunk_element_attr()].+-type chunk_element() :: {chunk_element_type(),chunk_element_attrs(),+                          chunk_elements()} | binary().+-type chunk_elements() :: [chunk_element()].+-type docs_v1() :: #docs_v1{}.+++-spec validate(Module) -> ok when+      Module :: module() | docs_v1().+%% Simple validation of erlang doc chunk. Check that all tags are supported and+%% that the signature is correct.+validate(Module) when is_atom(Module) ->+    {ok, Doc} = code:get_doc(Module),+    validate(Doc);+validate(#docs_v1{ module_doc = MDocs, docs = AllDocs }) ->+    _ = maps:map(fun(_Key,MDoc) -> validate(binary_to_term(MDoc)) end, MDocs),+    lists:map(fun({_,_Anno, Sig, Docs, _Meta}) ->+                      case lists:all(fun erlang:is_binary/1, Sig) of+                          false -> throw({invalid_signature,Sig});+                          true -> ok+                      end,+                      maps:map(fun(_Key,Doc) -> validate(binary_to_term(Doc)) end, Docs)+              end, AllDocs);+validate([H|T]) when is_tuple(H) ->+    _ = validate(H),+    validate(T);+validate({Tag,Attr,Content}) ->+    case lists:member(Tag,?ALL_TAGS) of+        false ->+            throw({invalid_tag,Tag});+        true ->+            ok+    end,+    true = is_list(Attr),+    validate(Content);+validate([Chars | T]) when is_binary(Chars) ->+    validate(T);+validate([]) ->+    ok.++%% Follows algorithm described here:+%% * https://medium.com/@patrickbrosset/when-does-white-space-matter-in-html-b90e8a7cdd33+%% which in turn follows this:+%% * https://www.w3.org/TR/css-text-3/#white-space-processing+-spec normalize(Docs) -> NormalizedDocs when+      Docs :: chunk_elements(),+      NormalizedDocs :: chunk_elements().+normalize(Docs) ->+    Trimmed = normalize_trim(Docs,true),+    normalize_space(Trimmed).++normalize_trim(Bin,true) when is_binary(Bin) ->+    %% Remove any whitespace (except \n) before or after a newline+    NoSpace = re:replace(Bin,"[^\\S\n]*\n+[^\\S\n]*","\n",[global]),+    %% Replace any tabs with space+    NoTab = re:replace(NoSpace,"\t"," ",[global]),+    %% Replace any newlines with space+    
NoNewLine = re:replace(NoTab,"\\v"," ",[global]),+    %% Replace any sequences of \s with a single " "+    re:replace(NoNewLine,"\\s+"," ",[global,{return,binary}]);+normalize_trim(Bin,false) when is_binary(Bin) ->+    Bin;+normalize_trim([{Tag,Attr,Content}|T],Trim) when Tag =:= pre;+                                                 Tag =:= code ->+    [{Tag,Attr,normalize_trim(Content,false)} | normalize_trim(T,Trim)];+normalize_trim([{Tag,Attr,Content}|T],Trim) ->+    [{Tag,Attr,normalize_trim(Content,Trim)} | normalize_trim(T,Trim)];+normalize_trim([<<>>|T],Trim) ->+    normalize_trim(T,Trim);+normalize_trim([B1,B2|T],Trim) when is_binary(B1),is_binary(B2) ->+    normalize_trim([<<B1/binary,B2/binary>> | T],Trim);+normalize_trim([H|T],Trim) ->+    [normalize_trim(H,Trim) | normalize_trim(T,Trim)];+normalize_trim([],_Trim) ->+    [].

Should we do this when writing to the chunk? 🤔 Otherwise every tool processing the chunk will have to do it.

garazdawi

comment created time in 7 hours

Pull request review comment erlang/otp

Introduce EEP-48 and help functions in shell

 where_is_file(Tail, File, Path, Files) ->             where_is_file(Tail, File)     end. +-spec get_doc(Mod) -> {ok, Res} | {error, Reason} when+      Mod :: module(),+      Res :: #docs_v1{},+      Reason :: non_existing | missing | file:posix().+get_doc(Mod) when is_atom(Mod) ->+    case which(Mod) of+        preloaded ->+            Fn = filename:join([code:lib_dir(erts),"ebin",atom_to_list(Mod) ++ ".beam"]),+            get_doc_chunk(Fn, Mod);+        Error when is_atom(Error) ->+            {error, Error};+        Fn ->+            get_doc_chunk(Fn, Mod)+    end.++get_doc_chunk(Filename, Mod) when is_atom(Mod) ->+    case beam_lib:chunks(Filename, ["Docs"]) of+        {error,beam_lib,{missing_chunk,_,_}} ->                +            case get_doc_chunk(Filename, atom_to_list(Mod)) of+                {error,missing} ->+                    get_doc_chunk_from_ast(Filename);+                Error ->+                    Error+            end;+        {error,beam_lib,{file_error,Filename,enoent}} ->+            get_doc_chunk(Filename, atom_to_list(Mod));+        {ok, {Mod, [{"Docs",Bin}]}} ->+            binary_to_term(Bin)+    end;+get_doc_chunk(Filename, Mod) ->+    case filename:dirname(Filename) of+        Filename ->+            {error,missing};+        Dir ->+            ChunkFile = filename:join([Dir,"doc","chunks",Mod ++ ".chunk"]),+            case file:read_file(ChunkFile) of+                {ok, Bin} ->+                    {ok, binary_to_term(Bin)};+                {error,enoent} ->+                    get_doc_chunk(Dir, Mod);+                {error,Reason} ->+                    {error,Reason}+            end+    end.++get_doc_chunk_from_ast(Filename) ->

I wonder if building a "fake chunk" from the AST on the fly is the way to go. Maybe code:get_doc/1 should remain truthful to what is actually in the chunk/disk and return an error in cases like this. What are the cases where the "fake chunk" is necessary? :)

garazdawi

comment created time in 7 hours

pull request comment elixir-lang/elixir

Multi-letter sigils

Hi @wojtekmach, thanks for the PR. :heart: Your implementation revealed conceptual problems that we were not aware of when simply discussing the approach, in particular:

  1. Allowing PID/Port/Reference to be built directly can lead to folks hardcoding references. Moving the sigils to IEx.Helpers can help tackle the problem but introduces its own issues too.

  2. Moving the inspection of some data types, such as URI, to sigils leads to loss of information. Today the inspected representation shows all fields, but the sigil-based one would hide them (see the sketch after this list). Of course, this issue exists for any non-opaque struct, such as the calendar types, but this PR may make it more apparent.

  3. The sigil syntax is not general purpose enough. We already knew this was the case for MapSet and what not, but this implementation even revealed some challenges in the URI sigil implementation.
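
A small sketch of the information-loss concern from point 2 (the ~URI sigil is the one proposed in this PR, shown hypothetically; the inspect output is abbreviated):

# Today, inspecting a URI shows every field of the struct:
IO.inspect(URI.parse("https://elixir-lang.org/docs"))
#=> %URI{scheme: "https", host: "elixir-lang.org", port: 443, path: "/docs", ...}

# A sigil-based representation would collapse it back to the original string,
# hiding the parsed fields:
# ~URI"https://elixir-lang.org/docs"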

For all of the reasons above, we won't be moving forward with this. Perhaps the answer is to not move towards sigils at all. For example, a more general Decimal("1.0") and Version("1.0.0") syntax could perhaps address all of these problems, including supporting MapSet, albeit slightly more verbosely. But even that would require further thought.

Thanks everyone for the feedback and back to the drawing board!

wojtekmach

comment created time in 11 hours

PR closed elixir-lang/elixir

Multi-letter sigils

Ref https://groups.google.com/d/msg/elixir-lang-core/QGU3ARQeL7w/AJMEtsYgBQAJ

+339 -208

8 comments

23 changed files

wojtekmach

pr closed time in 11 hours

issue comment dashbitco/broadway

Release v0.6.0

Thanks everyone, mission accomplished!

josevalim

comment created time in 11 hours

push event phoenixframework/phoenix_live_view

José Valim

commit sha f54901d7570a81543442ed75fa6fe50104a9dfe0

Unify connect mount/handle_params handling


push time in 13 hours

push event elixir-lang/elixir

José Valim

commit sha af4d6c3731e3e5b1d27fbccb4d1165ec8011042d

Revert "Support `import Mod, only: :sigils` (#9822)" It will be moved to the current multi-letter sigils PR. This reverts commit 26adc8db3dddfb580f5c08a3d7468d3d3190f892.


push time in 16 hours

pull request comment elixir-lang/elixir

Multi-letter sigils

@michalmuskala We have added warnings to the docs saying they need to be used carefully. Is that enough? Another option is to make them macros and have them warn if used inside a module, so we can effectively use them only in IEx.
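
A rough sketch of that second option, using a hypothetical multi-letter ~PID sigil backed by IEx.Helpers.pid/1; the relevant part is the __CALLER__.module check, which is nil in IEx and set inside modules:

defmodule MySigils do
  # Hypothetical sigil macro that warns when used inside a module, so it is
  # effectively only convenient in IEx.
  defmacro sigil_PID({:<<>>, _meta, [string]}, _modifiers) do
    if __CALLER__.module do
      IO.warn("~PID is intended for IEx usage only", Macro.Env.stacktrace(__CALLER__))
    end

    quote do
      IEx.Helpers.pid(unquote(string))
    end
  end
end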

wojtekmach

comment created time in 17 hours

issue comment elixir-lang/elixir

New logger fails when `handle_sasl_reports` and `metadata: :all`

In any case, thanks for the report!

brainlid

comment created time in a day

issue closed elixir-lang/elixir

New logger fails when `handle_sasl_reports` and `metadata: :all`

Environment

  • Elixir 1.10.1 (compiled with Erlang/OTP 22)
  • Operating system: Ubuntu LTS

Current behavior

Set config/prod.exs to something similar to this:

use Mix.Config

# Override for improved log output.
config :logger,
  backends: [:console],
  truncate: :infinity,
  compile_time_purge_matching: [
    [level_lower_than: :info]
  ],
  handle_otp_reports: true,
  handle_sasl_reports: true

config :logger, :console,
  format: "\n$time $metadata[$level] $levelpad$message\n",
  metadata: :all

Note that sasl logging is on and metadata is :all.

Then start the application:

MIX_ENV=prod mix phx.server

It blows up. The sasl reports include metadata that is a map, which is not handled or expected.

:gen_event handler Logger.Backends.Console installed in Logger terminating
** (exit) an exception was raised:
    ** (Protocol.UndefinedError) protocol String.Chars not implemented for %{tag: :info_report, type: :progress} of type Map. This protocol is implemented for the following type(s): ...
        (elixir 1.10.1) lib/string/chars.ex:3: String.Chars.impl_for!/1
        (elixir 1.10.1) lib/string/chars.ex:22: String.Chars.to_string/1
        (logger 1.10.1) lib/logger/formatter.ex:180: Logger.Formatter.metadata/1
        (logger 1.10.1) lib/logger/formatter.ex:180: Logger.Formatter.metadata/1
        (logger 1.10.1) lib/logger/formatter.ex:152: anonymous fn/6 in Logger.Formatter.format/5
        (elixir 1.10.1) lib/enum.ex:2111: Enum."-reduce/3-lists^foldl/2-0-"/3
        (logger 1.10.1) lib/logger/formatter.ex:151: Logger.Formatter.format/5
        (logger 1.10.1) lib/logger/backends/console.ex:186: Logger.Backends.Console.format_event/5

Expected behavior

It should not blow up. Perhaps ignore the data or convert it to a string; I don't even care about that specific message. An example of the unhandled data is the :logger_formatter key, whose value is %{title: 'PROGRESS REPORT'}.
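
For what it's worth, a sketch of the "ignore the data" option as it could look in a custom backend or formatter (the filter below is purely illustrative, not the Logger API): keep only metadata values that can safely be converted to strings and drop the rest.

# Illustrative filter: keep only metadata values the formatter can safely
# convert to strings, dropping report maps like the one in the crash above.
printable_metadata = fn metadata ->
  Enum.filter(metadata, fn {_key, value} ->
    is_binary(value) or is_atom(value) or is_number(value)
  end)
end

printable_metadata.([request_id: "abc123", logger_formatter: %{title: 'PROGRESS REPORT'}])
#=> [request_id: "abc123"]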

Why do I have :all?

I have a custom logger backend that formats to JSON. I let all the metadata through and use a blacklist to block the keys I don't want. With the change to 1.10, this behavior changed for me. I thought I should still report the issue, since it breaks when the two options are enabled.

closed time in a day

brainlid

issue comment elixir-lang/elixir

New logger fails when `handle_sasl_reports` and `metadata: :all`

Duplicate of #9814. :)

brainlid

comment created time in a day

issue comment beam-telemetry/telemetry

Add telemetry:span/3

I think the wall clock time can be computed in the handler. Most handlers want a timestamp on all events, so it is easier if the handler generates it instead of checking whether each event provides one or not.
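
A small sketch of that division of labour, assuming a generic [:my_app, :request, :stop] event whose measurements carry a duration derived from monotonic times (event and measurement names are illustrative):

:telemetry.attach("stamp-on-handle", [:my_app, :request, :stop],
  fn _event, measurements, _metadata, _config ->
    # The handler stamps wall-clock time itself...
    wall_clock = System.system_time(:millisecond)
    # ...and only takes the monotonic-time-derived duration from the event.
    IO.inspect({wall_clock, measurements.duration})
  end,
  nil)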


josevalim

comment created time in a day

issue closed dashbitco/broadway_rabbitmq

Handle duplicated messages after connection lost

If the connection is lost after the message has been successfully processed but before being acknowledged, there's no way to acknowledge the message anymore, since the acknowledgement is bound to the channel that delivered the message. The documentation states that:

A channel only exists in the context of a connection and never on its own. When a connection is closed, so are all channels on it.

That means messages that were processed but not acknowledged will be requeued and processed more than once. However, the documentation also explains that:

If a message is delivered to a consumer and then requeued (because it was not acknowledged before the consumer connection dropped, for example) then RabbitMQ will set the redelivered flag on it when it is delivered again (whether to the same consumer or a different one). This is a hint that a consumer may have seen this message before (although that's not guaranteed, the message may have made it out of the broker but not into a consumer before the connection dropped)

That raises a couple of questions:

  1. Should broadway_rabbitmq have a builtin way to handle duplicated messages due to a lost connection?
  2. Can we use the redelivered flag to avoid processing the message again? If so, how can we check whether the redelivered message was previously processed successfully or not? The new message will have a different delivery_tag on a new channel, which removes the possibility of comparing the messages. Is there another way? (See the sketch after this list.)
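
A hedged sketch of what question 2 could look like from a pipeline's point of view, assuming the producer exposes the redelivered flag in the message metadata and the application keeps its own idempotency store (already_processed?/1, process/1 and the :message_id key are made-up names; this clause would live in a Broadway pipeline module):

def handle_message(_processor, %Broadway.Message{metadata: metadata} = message, _context) do
  if metadata[:redelivered] && already_processed?(metadata[:message_id]) do
    # Skip the side effects; the message is still acknowledged as usual.
    message
  else
    process(message)
  end
end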

closed time in a day

msaraiva

issue closed dashbitco/broadway_rabbitmq

Use a shared connection among Broadway producers

Currently, each Broadway producer opens and monitors its own connection. This has the advantage of keeping producers completely independent from each other, each one maintaining its own state. However, RabbitMQ's documentation states that:

Each connection uses about 100 KB of RAM (and even more, if TLS is used). Thousands of connections can be a heavy burden on a RabbitMQ server. In the worst case, the server can crash due to out-of-memory. The AMQP protocol has a mechanism called channels that “multiplexes” a single TCP connection. It is recommended that each process only creates one TCP connection, and uses multiple channels in that connection for different threads.

So ideally, I believe we should have a separate process to maintain the connection. This process would be responsible for opening, monitoring and reopening the connection when necessary, using the backoff strategy chosen by the user, as sketched below.
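
A minimal sketch of that idea, assuming the amqp library's AMQP.Connection/AMQP.Channel API (module name is made up; monitoring, reconnection and backoff are omitted): one process owns the TCP connection and hands out channels to the producers.

defmodule ConnectionHolder do
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  # Producers call this instead of opening their own connection.
  def open_channel, do: GenServer.call(__MODULE__, :open_channel)

  @impl true
  def init(opts) do
    {:ok, conn} = AMQP.Connection.open(opts)
    {:ok, %{conn: conn}}
  end

  @impl true
  def handle_call(:open_channel, _from, %{conn: conn} = state) do
    {:reply, AMQP.Channel.open(conn), state}
  end
end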

closed time in a day

msaraiva

issue commentdashbitco/broadway_rabbitmq

Use a shared connection among Broadway producers

I think we should close this as now we have functionality about custom opts per producer. We could introduce a separate producer that shares it, but it is definitely not the focus ATM.

msaraiva

comment created time in a day

issue commentbeam-telemetry/telemetry

Add telemetry:span/3

Actually, one argument for 1 is that a tracing/span library may want all monotonic times so it can align all events within a trace. So perhaps keeping it monotonic_time and adding a note telling people not to use it as start time and stop time is the best way to go.

josevalim

comment created time in a day

issue commentbeam-telemetry/telemetry

Add telemetry:span/3

Good catch!

@tsloughter mentioned that the system_time could be fetched from the handler/reporter itself. And I agree, almost all events will need a system_time, so it doesn't make sense to ask people to include it.

My proposal:

  1. we should either call these fields "monotonic_time" or
  2. hide them altogether, as they are bound to be misused

Thoughts?

josevalim

comment created time in a day

Pull request review commentteamon/tesla

Conform telemetry middleware to established patterns

 if Code.ensure_loaded?(:telemetry) do
      end
 
-    :telemetry.attach("my-tesla-telemetry", [:tesla, :request], fn event, time, meta, config ->
+    :telemetry.attach("my-tesla-telemetry", [:tesla, :request, :stop], fn event, measurements, meta, config ->
       # Do something with the event
     end)
     ```
 
+    ## Telemetry Events
+
+    * `[:tesla, :request, :start]` - emitted at the beginning of the request.
+      * Measurement: `%{time: System.monotonic_time}`

Moving the discussion to: https://github.com/beam-telemetry/telemetry/issues/57#issuecomment-587668574

bryannaegele

comment created time in a day

Pull request review commentteamon/tesla

Conform telemetry middleware to established patterns

 if Code.ensure_loaded?(:telemetry) do
      end
 
-    :telemetry.attach("my-tesla-telemetry", [:tesla, :request], fn event, time, meta, config ->
+    :telemetry.attach("my-tesla-telemetry", [:tesla, :request, :stop], fn event, measurements, meta, config ->
       # Do something with the event
     end)
     ```
 
+    ## Telemetry Events
+
+    * `[:tesla, :request, :start]` - emitted at the beginning of the request.
+      * Measurement: `%{time: System.monotonic_time}`

Maybe we should call these fields "monotonic_time" or hide them altogether, as they are bound to be misused. Thoughts?

bryannaegele

comment created time in a day

Pull request review commentteamon/tesla

Conform telemetry middleware to established patterns

 if Code.ensure_loaded?(:telemetry) do
      end
 
-    :telemetry.attach("my-tesla-telemetry", [:tesla, :request], fn event, time, meta, config ->
+    :telemetry.attach("my-tesla-telemetry", [:tesla, :request, :stop], fn event, measurements, meta, config ->
       # Do something with the event
     end)
     ```
 
+    ## Telemetry Events
+
+    * `[:tesla, :request, :start]` - emitted at the beginning of the request.
+      * Measurement: `%{time: System.monotonic_time}`

@tsloughter - correct. But then, theoretically speaking, we don't need to measure the duration either, because that could also be done in the handler. So where do we draw the line?

bryannaegele

comment created time in a day

issue commenterlef/eef-documentation-wg

Translate Erlang/OTP XML into EEP 48 docs chunk

@KennethL, @erszcz is interested in doing that and we even have some funds reserved for it. I think the main blocker was the internal representation choice for the Erlang/OTP XML, and now that that is mostly settled, we can resume the EDoc work. So once this is in, I see two different avenues progressing at the same time:

  1. ExDoc integration with Erlang
  2. EDoc integration with EEP 48

I will schedule a working group meeting because if we are going to request funds, we need to convene and agree on the terms.

josevalim

comment created time in a day

pull request commenterlang/otp

Allow changing the -mode when the system restarts

Hi @rickard-green. I apologize if you are not the correct person to ping but is there any chance this can be looked at for inclusion in Erlang/OTP 23.0? This will provide a very meaningful speed up for Elixir releases. Thank you!

josevalim

comment created time in a day

issue commenterlef/eef-documentation-wg

Translate Erlang/OTP XML into EEP 48 docs chunk

The main unresolved issue is still what to do with the multiple function definitions. I'm starting to lean towards actually just re-writing the docs to not use that feature any more...

My current thinking is that EEP 48 will become more robust if we change it to support multiple entries. For Erlang and Elixir it boils down to an implementation detail, but I can imagine a language in the future that may want to document each of them individually, especially a statically typed language. In the worst case scenario, if we don't want to use the feature, it just ends up being a one-item list. :)

But if you think it is best to save this fight for later when the need arises, then that's ok by me too!

Once the PR is available, we will start looking into giving it a try on ExDoc and give you some feedback.

josevalim

comment created time in a day

issue commentdashbitco/broadway

Bad Docs for testing

It sounds like a cache issue somewhere :/

fireproofsocks

comment created time in a day

issue commentdashbitco/broadway

Release v0.6.0

Broadway v0.6.0 is out, so now it is your turn again. :D

josevalim

comment created time in a day

created tagdashbitco/broadway

tagv0.6.0

Concurrent and multi-stage data ingestion and data processing with Elixir

created time in a day

push eventdashbitco/broadway

José Valim

commit sha a405918091675c706ebb8ed96355a19bc463a837

Add Kafka guide to listing

view details

push time in a day

push eventdashbitco/broadway

José Valim

commit sha b88fba6b859d5815c46d269fb317227c9fe07dfc

Update telemetry event to standard

view details

push time in a day

issue commenterlef/eef-documentation-wg

Translate Erlang/OTP XML into EEP 48 docs chunk

Fantastic! If there is anything I can do or if you want to jump on a call to discuss the unresolved issues, please let me know!

josevalim

comment created time in a day

issue commentdashbitco/broadway

Bad Docs for testing

I use the exact same Erlang/Elixir from Homebrew but on Mojave and it worked. :(

fireproofsocks

comment created time in a day

issue commentdashbitco/broadway

Bad Docs for testing

@fireproofsocks the project compiled just fine using the mentioned Elixir version. What is your OTP version?

fireproofsocks

comment created time in a day

push eventphoenixframework/phoenix_live_view

José Valim

commit sha 29d494c078be64ffba82e2a93a18027daad13d6b

Component users should not rely on @flash directly

view details

push time in a day

push eventphoenixframework/phoenix

José Valim

commit sha bb1bc49e35c28170511d26343d042d1b7413c5b8

Clean up latest commit

view details

push time in a day

PR closed phoenixframework/phoenix_live_view

Add life cycle class and disable-with to click event

Resolve #49 and part of #89

Haven't generalized both the life cycle class and disable-with yet since I'm not sure what @chrismccord mentioned about data-ref in #105 means.

Would you mind explaining? I would be happy to work on it :)

+8477 -5

1 comment

3 changed files

iboss-ptk

pr closed time in a day

pull request commentphoenixframework/phoenix_live_view

Add life cycle class and disable-with to click event

Hi, this has been merged in a separate PR. It took a while because it required refactoring the foundation. In any case, thanks for the PR!

iboss-ptk

comment created time in a day

issue commentphoenixframework/phoenix_live_view

Do not use phx_disconnected on live_redirect

Not necessary with the new loading events.

josevalim

comment created time in a day

Pull request review commentelixir-lang/elixir

Add checking of patterns in body

 defmodule Module.Types.Expr do   @moduledoc false -  def of_expr(_expr, _stack, context) do+  import Module.Types.{Helpers, Infer}+  import Module.Types.Pattern, only: [of_guard: 3, of_pattern: 3]++  # :atom+  def of_expr(atom, _stack, context) when is_atom(atom) do+    {:ok, {:atom, atom}, context}+  end++  # 12+  def of_expr(literal, _stack, context) when is_integer(literal) do+    {:ok, :integer, context}+  end++  # 1.2+  def of_expr(literal, _stack, context) when is_float(literal) do+    {:ok, :float, context}+  end++  # "..."+  def of_expr(literal, _stack, context) when is_binary(literal) do+    {:ok, :binary, context}+  end++  # fn -> ... end+  def of_expr(literal, _stack, context) when is_function(literal) do

I have pushed a commit that removes is_function from quoted.

ericmj

comment created time in a day

push eventelixir-lang/elixir

José Valim

commit sha 7962c6859adf38bd02ed7cbaf80bd44e1bea5904

Normalize literal functions during expansion

view details

push time in a day

Pull request review commentelixir-lang/elixir

Add checking of patterns in body

 defmodule Module.Types.Expr do   @moduledoc false -  def of_expr(_expr, _stack, context) do+  import Module.Types.{Helpers, Infer}+  import Module.Types.Pattern, only: [of_guard: 3, of_pattern: 3]++  # :atom+  def of_expr(atom, _stack, context) when is_atom(atom) do+    {:ok, {:atom, atom}, context}+  end++  # 12+  def of_expr(literal, _stack, context) when is_integer(literal) do+    {:ok, :integer, context}+  end++  # 1.2+  def of_expr(literal, _stack, context) when is_float(literal) do+    {:ok, :float, context}+  end++  # "..."+  def of_expr(literal, _stack, context) when is_binary(literal) do+    {:ok, :binary, context}+  end++  # fn -> ... end+  def of_expr(literal, _stack, context) when is_function(literal) do+    {:ok, :dynamic, context}+  end++  # #PID<...>+  def of_expr(literal, _stack, context) when is_pid(literal) do+    {:ok, :dynamic, context}+  end++  # #Reference<...>+  def of_expr(literal, _stack, context) when is_reference(literal) do+    {:ok, :dynamic, context}+  end++  # #Port<...>+  def of_expr(literal, _stack, context) when is_port(literal) do+    {:ok, :dynamic, context}+  end++  # <<...>>>+  def of_expr({:<<>>, _meta, args} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)++    result =+      reduce_ok(args, context, fn expr, context ->+        of_binary(expr, stack, context, &of_expr/3)+      end)++    case result do+      {:ok, context} -> {:ok, :binary, context}+      {:error, reason} -> {:error, reason}+    end+  end++  # left | []+  def of_expr({:|, _meta, [left_expr, []]} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)+    of_expr(left_expr, stack, context)+  end++  # left | right+  def of_expr({:|, _meta, [left_expr, right_expr]} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)++    case of_expr(left_expr, stack, context) do+      {:ok, left, context} ->+        case of_expr(right_expr, stack, context) do+          {:ok, {:list, right}, context} ->+            {:ok, to_union([left, right], context), context}++          {:ok, right, context} ->+            {:ok, to_union([left, right], context), context}++          {:error, reason} ->+            {:error, reason}+        end++      {:error, reason} ->+        {:error, reason}+    end+  end++  # []+  def of_expr([], _stack, context) do+    {:ok, {:list, :dynamic}, context}+  end++  # [expr, ...]+  def of_expr(exprs, stack, context) when is_list(exprs) do+    stack = push_expr_stack(exprs, stack)++    case map_reduce_ok(exprs, context, &of_expr(&1, stack, &2)) do+      {:ok, types, context} -> {:ok, {:list, to_union(types, context)}, context}+      {:error, reason} -> {:error, reason}+    end+  end++  # __CALLER__+  def of_expr({:__CALLER__, _meta, var_context}, _stack, context) when is_atom(var_context) do+    # TODO: Full %Macro.Env{} struct+    {:ok, {:map, [{{:atom, :__struct__}, {:atom, Macro.Env}}]}, context}+  end++  # __STACKTRACE__+  def of_expr({:__STACKTRACE__, _meta, var_context}, _stack, context) when is_atom(var_context) do+    file = {:tuple, [{:atom, :file}, {:list, :integer}]}+    line = {:tuple, [{:atom, :line}, :integer]}+    file_line = {:list, {:union, [file, line]}}+    type = {:list, {:tuple, [:atom, :atom, :integer, file_line]}}+    {:ok, type, context}+  end++  # var+  def of_expr(var, _stack, context) when is_var(var) do+    {type, context} = new_var(var, context)+    {:ok, type, context}+  end++  # {left, right}+  def of_expr({left, right}, stack, context) do+    of_expr({:{}, [], [left, right]}, stack, context)+  end++  # 
{...}+  def of_expr({:{}, _meta, exprs} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)++    case map_reduce_ok(exprs, context, &of_expr(&1, stack, &2)) do+      {:ok, types, context} -> {:ok, {:tuple, types}, context}+      {:error, reason} -> {:error, reason}+    end+  end++  # left = right+  def of_expr({:=, _meta, [left_expr, right_expr]} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)++    with {:ok, left_type, context} <- of_pattern(left_expr, stack, context),+         {:ok, right_type, context} <- of_expr(right_expr, stack, context),+         do: unify(right_type, left_type, stack, context)+  end++  # %{map | ...}+  def of_expr({:%{}, _, [{:|, _, [_map, args]}]} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)++    case of_pairs(args, stack, context) do+      {:ok, _pairs, context} -> {:ok, {:map, []}, context}+      {:error, reason} -> {:error, reason}+    end+  end++  # %Struct{map | ...}+  def of_expr({:%, _, [module, {:%{}, _, [{:|, _, [_map, args]}]}]} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)++    case of_pairs(args, stack, context) do+      {:ok, _pairs, context} ->+        pairs = [{{:atom, :__struct__}, {:atom, module}}]+        {:ok, {:map, pairs}, context}++      {:error, reason} ->+        {:error, reason}+    end+  end++  # %{...}+  def of_expr({:%{}, _meta, args} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)++    case of_pairs(args, stack, context) do+      {:ok, pairs, context} -> {:ok, {:map, pairs_to_unions(pairs, context)}, context}+      {:error, reason} -> {:error, reason}+    end+  end++  # %Struct{...}+  def of_expr({:%, _meta1, [module, {:%{}, _meta2, args}]} = expr, stack, context) do+    stack = push_expr_stack(expr, stack)++    case of_pairs(args, stack, context) do+      {:ok, pairs, context} ->+        pairs = [{{:atom, :__struct__}, {:atom, module}} | pairs]+        {:ok, {:map, pairs}, context}++      {:error, reason} ->+        {:error, reason}+    end+  end++  # ()+  def of_expr({:__block__, _meta, []}, _stack, context) do+    {:ok, {:atom, nil}, context}+  end++  # (expr; expr)+  def of_expr({:__block__, _meta, exprs}, stack, context) do+    # Clear expression stack here?
    # TODO: Clear expression stack here?

?

ericmj

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Add checking of patterns in body

 defmodule Module.Types.Expr do   @moduledoc false -  def of_expr(_expr, _stack, context) do+  import Module.Types.{Helpers, Infer}+  import Module.Types.Pattern, only: [of_guard: 3, of_pattern: 3]++  # :atom+  def of_expr(atom, _stack, context) when is_atom(atom) do+    {:ok, {:atom, atom}, context}+  end++  # 12+  def of_expr(literal, _stack, context) when is_integer(literal) do+    {:ok, :integer, context}+  end++  # 1.2+  def of_expr(literal, _stack, context) when is_float(literal) do+    {:ok, :float, context}+  end++  # "..."+  def of_expr(literal, _stack, context) when is_binary(literal) do+    {:ok, :binary, context}+  end++  # fn -> ... end+  def of_expr(literal, _stack, context) when is_function(literal) do+    {:ok, :dynamic, context}+  end++  # #PID<...>+  def of_expr(literal, _stack, context) when is_pid(literal) do

I think we can return :pid here but reference and port are not valid AST nodes, so you can remove the two clauses below. :)

ericmj

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Add checking of patterns in body

 defmodule Kernel.SpecialForms do
       iex> <<"foo"::utf32>>
       <<0, 0, 0, 102, 0, 0, 0, 111, 0, 0, 0, 111>>
 
+  Otherwise we get an `ArgumentError` when construcing the binary:
+
+      > rest = "oo"
+      > <<102, rest>>
      <<102, rest>>
ericmj

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Add checking of patterns in body

 defmodule Kernel.SpecialForms do
       iex> <<"foo"::utf32>>
       <<0, 0, 0, 102, 0, 0, 0, 111, 0, 0, 0, 111>>
 
+  Otherwise we get an `ArgumentError` when construcing the binary:
+
+      > rest = "oo"
      rest = "oo"
ericmj

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule Kernel do     raise ArgumentError, "modifier must be one of: s, a, c"   end +  @doc """+  Handles the `~PID` sigil.++  This function allows creating arbitrary PIDs, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~PID<0.108.0>+      ~PID<0.108.0>++  """+  @doc since: "1.11.0"+  def sigil_PID(string, [] = _modifiers) do+    :erlang.list_to_pid('<#{string}>')+  end++  @doc """+  Handles the `~Port` sigil.++  This function allows creating arbitrary ports, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~Port<0.6>+      ~Port<0.6>++  """+  @doc since: "1.11.0"+  def sigil_Port(string, [] = _modifiers) do+    :erlang.list_to_port('#Port<#{string}>')+  end++  @doc """+  Handles the `~Reference` sigil.++  This function allows creating arbitrary references, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~Reference<0.2283498464.2022703108.246828>+      ~Reference<0.2283498464.2022703108.246828>++  """+  @doc since: "1.11.0"+  def sigil_Reference(string, [] = _modifiers) do+    :erlang.list_to_ref('#Ref<#{string}>')+  end++  @doc """+  Handles the `~URI` sigil.++  See `URI.parse/1` for more information.++  ## Examples++      iex> ~URI<https://elixir-lang.org>+      ~URI<https://elixir-lang.org>++  """+  @doc since: "1.11.0"+  defmacro sigil_URI({:<<>>, _, [string]} = _uri_string, [] = _modifiers) do+    Macro.escape(URI.parse(string))+  end++  @doc """+  Handles the `~Version` sigil.++  See `Version.parse!/1` for more information.++  ## Examples++      iex> ~Version<2.0.1-alpha1>+      ~Version<2.0.1-alpha1>++  """+  @doc since: "1.11.0"+  defmacro sigil_Version({:<<>>, _, [string]} = _version_string, [] = _modifiers) do+    Macro.escape(Version.parse!(string))+  end

Yeah! It doesn't explain why we have sigils for Regex or Date/Time though. :/

So the rule is like "it has to be a built-in type and use a multi-letter sigil" and that relies on users knowing what is a built-in type and there are probably some gotchas still. I think it is easier to go full-Oprah on this and say "you get a Kernel sigil", "you get a Kernel sigil" and "you get a Kernel sigil".

wojtekmach

comment created time in 2 days

issue closedphoenixframework/phoenix_live_view

LiveView rendered via a controller raises error

Environment

  • Elixir version (elixir -v): 1.10.0
  • Phoenix version (mix deps): 1.4.13
  • Phoenix LiveView version (mix deps): 0.7.1
  • NodeJS version (node -v): 10.18.1
  • NPM version (npm -v): 6.13.4
  • Operating system: Linux
  • Browsers you attempted to reproduce this bug on (the more the merrier): Chromium 79.0.3945.130 (Build) (64 bit)
  • Does the problem persist after removing "assets/node_modules" and trying again? Yes/no: yes

Actual behavior

So, the story is that I'm trying to render only one LiveView via a controller because I need to send a 404 in some cases and I didn't know how to do it another way. Anyway, that LiveView was previously routed using live, so it has a handle_params function defined and is rendered with this line:

    live_render(conn, Journey.Manage.TableLive, router: Journey.Manage.Router,
      session: %{"params" => params})

so with a router option defined because I wanted to generate paths for it.

Trying out the view raises this error

13:31:16.478 [error] GenServer #PID<0.980.0> terminating
** (RuntimeError) cannot invoke handle_params/3 for Journey.Manage.TableLive because Journey.Manage.TableLive was not mounted at the router with the live/3 macro under URL "http://localhost:4001/cont/location/page/3"
    (phoenix_live_view 0.7.1) lib/phoenix_live_view/channel.ex:203: Phoenix.LiveView.Channel.maybe_call_mount_handle_params/4
    (phoenix_live_view 0.7.1) lib/phoenix_live_view/channel.ex:604: Phoenix.LiveView.Channel.verified_mount/4
    (phoenix_live_view 0.7.1) lib/phoenix_live_view/channel.ex:34: Phoenix.LiveView.Channel.handle_info/2
    (stdlib 3.10) gen_server.erl:637: :gen_server.try_dispatch/4
    (stdlib 3.10) gen_server.erl:711: :gen_server.handle_msg/6
    (stdlib 3.10) proc_lib.erl:249: :proc_lib.init_p_do_apply/3
Last message: {:mount, Phoenix.LiveView.Channel}
State: {%{"joins" => 0, "params" => %{"_csrf_token" => "<a token>"}, "session" => "<a session>", "static" => "<a static>", "url" => "http://localhost:4001/cont/location/page/3"}, {#PID<0.976.0>, #Reference<0.2186393052.255590402.119379>}, %Phoenix.Socket{assigns: %{}, channel: Phoenix.LiveView.Channel, channel_pid: nil, endpoint: Journey.Manage.Endpoint, handler: Phoenix.LiveView.Socket, id: nil, join_ref: "4", joined: false, private: %{session: %{"_csrf_token" => "Jx6sy-47xywhm5eN_9T2EsJT"}}, pubsub_server: Journey.Manage.PubSub, ref: nil, serializer: Phoenix.Socket.V2.JSONSerializer, topic: "lv:phx-FfR_EBHS3cBWbQMh", transport: :websocket, transport_pid: #PID<0.976.0>}}
13:31:16.478 [error] an exception was raised:
    ** (RuntimeError) cannot invoke handle_params/3 for Journey.Manage.TableLive because Journey.Manage.TableLive was not mounted at the router with the live/3 macro under URL "http://localhost:4001/cont/location/page/3"
        (phoenix_live_view 0.7.1) lib/phoenix_live_view/channel.ex:203: Phoenix.LiveView.Channel.maybe_call_mount_handle_params/4
        (phoenix_live_view 0.7.1) lib/phoenix_live_view/channel.ex:604: Phoenix.LiveView.Channel.verified_mount/4
        (phoenix_live_view 0.7.1) lib/phoenix_live_view/channel.ex:34: Phoenix.LiveView.Channel.handle_info/2
        (stdlib 3.10) gen_server.erl:637: :gen_server.try_dispatch/4
        (stdlib 3.10) gen_server.erl:711: :gen_server.handle_msg/6
        (stdlib 3.10) proc_lib.erl:249: :proc_lib.init_p_do_apply/3

Expected behavior

At least that the handle_params function is ignored and the error not raised

closed time in 2 days

azazel75

issue commentphoenixframework/phoenix_live_view

LiveView rendered via a controller raises error

So, the story is that I'm trying to render only one LiveView via a controller because I need to send a 404 in some cases and I didn't know how to do it another way

You can have a plug in your router that handles these cases, or you can raise an exception on LiveView mount and have the exception be treated as a 404 by implementing Plug.Exception.
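A minimal sketch of the second option (module names are hypothetical, and it relies on Plug's fallback Plug.Exception implementation reading the :plug_status field):

defmodule MyAppWeb.NotFoundError do
  defexception message: "not found", plug_status: 404
end

defmodule MyAppWeb.TableLive do
  use Phoenix.LiveView

  def mount(_params, session, socket) do
    case session["params"] do
      # Raising in mount is rendered as a 404 thanks to :plug_status.
      nil -> raise MyAppWeb.NotFoundError
      params -> {:ok, assign(socket, params: params)}
    end
  end

  def render(assigns), do: ~L"<div>...</div>"
end

Plug.Exception can also be implemented directly for the exception if more control over the status is needed.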

We may remove this error in the future but for now we want to push folks to declare their LiveView in the router as much as possible.

Thanks!

azazel75

comment created time in 2 days

issue commentphoenixframework/phoenix_live_view

Hooks - updated() doesn't affect elements style - JCF

Can you please provide a sample application that reproduces the error? This will speed up the fixing process considerably. Otherwise it is unclear when we would have time to put into reproducing the issue itself. Thanks!

Kaquadu

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule Kernel do     raise ArgumentError, "modifier must be one of: s, a, c"   end +  @doc """+  Handles the `~PID` sigil.++  This function allows creating arbitrary PIDs, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~PID<0.108.0>+      ~PID<0.108.0>++  """+  @doc since: "1.11.0"+  def sigil_PID(string, [] = _modifiers) do+    :erlang.list_to_pid('<#{string}>')+  end++  @doc """+  Handles the `~Port` sigil.++  This function allows creating arbitrary ports, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~Port<0.6>+      ~Port<0.6>++  """+  @doc since: "1.11.0"+  def sigil_Port(string, [] = _modifiers) do+    :erlang.list_to_port('#Port<#{string}>')+  end++  @doc """+  Handles the `~Reference` sigil.++  This function allows creating arbitrary references, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~Reference<0.2283498464.2022703108.246828>+      ~Reference<0.2283498464.2022703108.246828>++  """+  @doc since: "1.11.0"+  def sigil_Reference(string, [] = _modifiers) do+    :erlang.list_to_ref('#Ref<#{string}>')+  end++  @doc """+  Handles the `~URI` sigil.++  See `URI.parse/1` for more information.++  ## Examples++      iex> ~URI<https://elixir-lang.org>+      ~URI<https://elixir-lang.org>++  """+  @doc since: "1.11.0"+  defmacro sigil_URI({:<<>>, _, [string]} = _uri_string, [] = _modifiers) do+    Macro.escape(URI.parse(string))+  end++  @doc """+  Handles the `~Version` sigil.++  See `Version.parse!/1` for more information.++  ## Examples++      iex> ~Version<2.0.1-alpha1>+      ~Version<2.0.1-alpha1>++  """+  @doc since: "1.11.0"+  defmacro sigil_Version({:<<>>, _, [string]} = _version_string, [] = _modifiers) do+    Macro.escape(Version.parse!(string))+  end

Let's separate these discussions. My main concern is: can you explain why ~Version is not in Kernel but ~Port is? Why is ~r in Kernel and not ~URI?

There is one heuristic I could come up with to explain it, but it is so long that I am too lazy to even type it. :P

wojtekmach

comment created time in 2 days

push eventelixir-lang/elixir-lang.github.com

José Valim

commit sha 69db9682a66c4e8ea396bf1fdf3800981b90a9a8

Tidy up runtime config docs, closes #1370

view details

push time in 2 days

issue closedelixir-lang/elixir-lang.github.com

Unclear how to setup routing table in "Assembling multiple releases"

I just finished working through the Mix and OTP guide. At the end of the "Assembling multiple releases" section of the "Configuration and releases" page, it's unclear how to set up the routing table to get the foo and bar releases to work, so the app is left in a non-working state. The "Configuring releases" section alludes to reading a routing table from disk, but I'm still unsure how to proceed.

closed time in 2 days

abstractcoder

issue commentelixir-lang/elixir-lang.github.com

Unclear how to setup routing table in "Assembling multiple releases"

I don't think configuring the routing table is a good idea because you can end up with cycles and whatnot, but here is how you would do this. Your config/releases.exs would be written like this:

import Config
{table, _} = Code.eval_file("routing_table_from_disk.exs")
config :kv, :routing_table, table

Where "routing_table_from_disk.exs" is a file at the root of your release. You can add files at the root of your release using rel/overlays available in Elixir v1.10. Add this:

# rel/overlays/routing_table_from_disk.exs
[
  {?a..?m, :"foo@computer-name"},
  {?n..?z, :"bar@computer-name"}
]

And you should be good to go! I will simplify that section for now. Thanks!

abstractcoder

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule IEx.Helpers do
   ## Examples
 
       iex> ref("0.1.2.3")
-      #Reference<0.1.2.3>
+      ~Reference<0.1.2.3>
 
   """
   @doc since: "1.6.0"
+  # TODO: hard-deprecate in favour of ~Reference sigil in Elixir v1.13

We should @doc false them too.

wojtekmach

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule IEx.Helpers do
   ## Examples
 
       iex> port("0.4")
-      #Port<0.4>
+      ~Port<0.4>
 
   """
   @doc since: "1.8.0"
+  # TODO: deprecate in favour of ~Port sigil
  # TODO: deprecate in favour of ~Port sigil in v1.13
wojtekmach

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule IEx.Helpers do
   ## Examples
 
       iex> pid("0.21.32")
-      #PID<0.21.32>
+      ~PID<0.21.32>
 
   """
+  # TODO: deprecate in favour of ~PID sigil
  # TODO: deprecate in favour of ~PID sigil in v1.13
wojtekmach

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule Kernel do     raise ArgumentError, "modifier must be one of: s, a, c"   end +  @doc """+  Handles the `~PID` sigil.++  This function allows creating arbitrary PIDs, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~PID<0.108.0>+      ~PID<0.108.0>++  """+  @doc since: "1.11.0"+  def sigil_PID(string, [] = _modifiers) do+    :erlang.list_to_pid('<#{string}>')+  end++  @doc """+  Handles the `~Port` sigil.++  This function allows creating arbitrary ports, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~Port<0.6>+      ~Port<0.6>++  """+  @doc since: "1.11.0"+  def sigil_Port(string, [] = _modifiers) do+    :erlang.list_to_port('#Port<#{string}>')+  end++  @doc """+  Handles the `~Reference` sigil.++  This function allows creating arbitrary references, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.++  ## Examples++      iex> ~Reference<0.2283498464.2022703108.246828>+      ~Reference<0.2283498464.2022703108.246828>++  """+  @doc since: "1.11.0"+  def sigil_Reference(string, [] = _modifiers) do+    :erlang.list_to_ref('#Ref<#{string}>')+  end++  @doc """+  Handles the `~URI` sigil.++  See `URI.parse/1` for more information.++  ## Examples++      iex> ~URI<https://elixir-lang.org>+      ~URI<https://elixir-lang.org>++  """+  @doc since: "1.11.0"+  defmacro sigil_URI({:<<>>, _, [string]} = _uri_string, [] = _modifiers) do+    Macro.escape(URI.parse(string))+  end++  @doc """+  Handles the `~Version` sigil.++  See `Version.parse!/1` for more information.++  ## Examples++      iex> ~Version<2.0.1-alpha1>+      ~Version<2.0.1-alpha1>++  """+  @doc since: "1.11.0"+  defmacro sigil_Version({:<<>>, _, [string]} = _version_string, [] = _modifiers) do+    Macro.escape(Version.parse!(string))+  end

So I have thought more about having this in Kernel vs in the respective modules, at least for non built-in types. The "issue" is that we already have many built-in types with sigils in Kernel, so I think it is best to keep all of them there, otherwise the line is arbitrary.

The other issue with the current implementation is that it is not "plug and play". For example, if something returns ~Decimal<1.0>, I cannot simply use that in my code unless the sigil is imported. This UI can be confusing. So my suggestion is that, when compilation fails due to a non-existing local function, we check:

  1. If the function is a sigil starting with an uppercase letter

  2. If so, whether there is a module with the name of the sigil that exports a sigil_NAME function or macro. If there is, the error message will add: "Perhaps you meant to "import Decimal, only: :sigils" before using it?" or similar

Thoughts?
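To make the mechanics concrete, here is a hypothetical sketch (module and sigil names are made up, and it assumes the multi-letter sigil support from this PR): a library defines a sigil_NAME function and the caller has to import it before the sigil compiles as a local call.

defmodule MyLib.Sigils do
  # ~ABC<hello> would compile to sigil_ABC("hello", []).
  def sigil_ABC(string, [] = _modifiers) do
    String.upcase(string)
  end
end

defmodule Demo do
  import MyLib.Sigils

  def run, do: ~ABC<hello>
  # => "HELLO"
end

Without the import, ~ABC<hello> fails to compile as an undefined local sigil_ABC/2, which is where the improved error message above would kick in.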

wojtekmach

comment created time in 2 days

Pull request review commentphoenixframework/phoenix_live_view

Add data-ref with css lifecycle classes

 defmodule Phoenix.LiveView.Channel do
         end
       end)
 
-    new_socket = Utils.merge_flash(socket, flash)
+    new_socket = if redirected, do: Utils.merge_flash(socket, flash), else: socket

Is this correct? If a component redirects, don't we want to merge its flash into the parent? 🤔

chrismccord

comment created time in 2 days

Pull request review commentphoenixframework/phoenix_live_view

Add data-ref with css lifecycle classes

 defmodule Phoenix.LiveView do       class will be applied in conjunction with `"phx-disconnected"` if connection       to the server is lost. +  All `phx-` event bindings apply their own css classes when pushed. For example+  the following markup:++      <button phx-click="clicked" phx-window-keydown="key">...</button>++  In the case of forms, when a `phx-change` is sent to the server, the input element+  which emitted the change receives the `phx-change-loading` class, along wiht the+  parent form tag.++  On click, would receive the `phx-click-loading` class, and on keydown would receive+  the `phx-keydown-loading` class. The css loading classes are maintained until an+  acknowledgement is received on the client for the pushed event. The following events+  receive css loadng classes:++    - `phx-click` - `phx-click-loading`+    - `phx-change` - `phx-change-loading`+    - `phx-submit` - `phx-submit-loading`+    - `phx-focus` - `phx-focus-loading`+    - `phx-blur` - `phx-blur-loading`+    - `phx-window-keydown` - `phx-keydown-loading`+    - `phx-window-keyup` - `phx-keyup-loading`++  For live page navigation via `live_redirect` and `live_patch`, as well as form+  submits via `phx-submit`, the JavaScript events `"phx:page-loading-start"` and+  `"phx:page-loading-stop"` are dispatched on window. Additionally, any `phx-`+  event may dispatch page loading events by annotating the DOM element with+  `phx-page-loading`. This is useful for showing main page loading status, for example:++      // app.js+      import NProgress from "nprogress"+      window.addEventListener("phx:page-loading-start", info => NProgress.start())+      window.addEventListener("phx:page-loading-stop", info => NProgress.done())++  The `info` object will contain a `kind` key, with values one of:++    - `"redirect"` - the event was triggered by a redirect+    - `"patch"` - the event was triggered by a patch+    - `"initial"` - the event was triggered by initial page load+    - `"element"` - the event was triggered by a `phx-` bound element, such as `phx-click`

We are missing "submit" from this list or does it fall under "element"?

chrismccord

comment created time in 2 days

Pull request review commentphoenixframework/phoenix_live_view

Add data-ref with css lifecycle classes

   background: #ffe6f0!important;
 }
 
+/* Loading states */
+.phx-click-loading {

Also, should we delete this whole CSS file and just push people to use NProgress?

chrismccord

comment created time in 2 days

Pull request review commentphoenixframework/phoenix_live_view

Add data-ref with css lifecycle classes

   background: #ffe6f0!important;
 }
 
+/* Loading states */
+.phx-click-loading {

We can remove these classes, right? They were just for trying things out?

chrismccord

comment created time in 2 days

issue commentphoenixframework/phoenix_live_view

Question: Is it possible for a LiveView to efficiently display and update lists?

@devshahani today there is a very simple solution. Since each element inside an append needs an ID, you can delete anything by sending the container with an ID and empty contents:

<div id="id_you_want_to_delete" class="deleted">

Whatever we add will simply be an optimization to fully remove it from DOM.

jwietelmann

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule Kernel do     raise ArgumentError, "modifier must be one of: s, a, c"   end +  @doc """+  Handles the `~PID` sigil.++  This function allows creating arbitrary PIDs, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.+  """+  @doc since: "1.11.0"+  def sigil_PID(string, modifiers)++  def sigil_PID(string, []) do+    :erlang.list_to_pid('<#{string}>')+  end++  @doc """+  Handles the `~Port` sigil.++  This function allows creating arbitrary ports, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.+  """+  @doc since: "1.11.0"+  def sigil_Port(string, modifiers)++  def sigil_Port(string, []) do+    :erlang.list_to_port('~Port<#{string}>')+  end++  @doc """+  Handles the `~Reference` sigil.++  This function allows creating arbitrary references, even the ones that don't actually exist in the VM.+  As such, it should only be used for debugging purposes.+  """+  @doc since: "1.11.0"+  def sigil_Reference(string, modifiers)++  def sigil_Reference(string, []) do+    :erlang.list_to_ref('#Ref<#{string}>')+  end++  @doc """+  Handles the `~URI` sigil.++  See `URI.parse/1` for more information.+  """+  @doc since: "1.11.0"+  defmacro sigil_URI(string, modifiers)++  defmacro sigil_URI({:<<>>, _, [string]}, []) do+    Macro.escape(URI.parse(string))+  end++  @doc """+  Handles the `~Version` sigil.++  See `Version.parse!/1` for more information.+  """+  @doc since: "1.11.0"+  defmacro sigil_Version(string, modifiers)++  defmacro sigil_Version({:<<>>, _, [string]}, []) do+    Macro.escape(Version.parse!(string))+  end

This is a good point. On one hand it would be convenient to have them in Kernel, but at the same time it would be fairer to have them imported, as that is what libraries outside of Elixir would also require. Does anyone have any further thoughts?

wojtekmach

comment created time in 2 days

pull request commentelixir-lang/elixir

Use simpler example for doctests and exceptions

:green_heart: :blue_heart: :purple_heart: :yellow_heart: :heart:

wojtekmach

comment created time in 2 days

push eventelixir-lang/elixir

Wojtek Mach

commit sha 9d74c6a718df0ba827aa53dd9a906c1354f05241

Use simpler example for doctests and exceptions (#9827)

view details

push time in 2 days

Pull request review commentteamon/tesla

Conform telemetry middleware to established patterns

 if Code.ensure_loaded?(:telemetry) do
      end
 
-    :telemetry.attach("my-tesla-telemetry", [:tesla, :request], fn event, time, meta, config ->
+    :telemetry.attach("my-tesla-telemetry", [:tesla, :request, :stop], fn event, measurements, meta, config ->
       # Do something with the event
     end)
     ```
 
+    ## Telemetry Events
+
+    * `[:tesla, :request, :start]` - emitted at the beginning of the request.
+      * Measurement: `%{time: System.monotonic_time}`
+      * Metadata: `%{env: Tesla.Env.t()}`
+
+    * `[:tesla, :request, :stop]` - emitted at the end of the request.
+      * Measurement: `%{duration: native_time}`
+      * Metadata: `%{env: Tesla.Env.t()} | %{env: Tesla.Env.t, error: term}`
+
+    * `[:tesla, :request, :fail]` - emitted when there is an error.

Sorry but I just proposed for us to name this :failure instead of :fail. I noticed :fail reads a bit weird when implementing the same on Broadway. Please let me know your thoughts. :)

bryannaegele

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Use simpler example for doctests and exceptions

 defmodule ExUnit.DocTest do
 
   You can also showcase expressions raising an exception, for example:
 
-      iex(1)> String.to_atom((fn -> 1 end).())
-      ** (ArgumentError) argument error
+      iex(1)> raise "some error"
+      ** (RuntimeErroro) some error

Make sure this won't emit warnings when the suite is running. :)

wojtekmach

comment created time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 end
 
 defimpl Inspect, for: Version do
   def inspect(self, _opts) do
-    "#Version<" <> to_string(self) <> ">"
+    "~Version<" <> to_string(self) <> ">"

Yes, whenever we change inspected representations, tests may break. However, inspected representations are allowed to change, because the primary goal is developer readability and information. I would say this is even a pro of this PR, because we will push people towards a programmatic representation instead of a textual one.

wojtekmach

comment created time in 2 days

pull request commentphoenixframework/phoenix

Allow setting mfa for loading socket session config

:green_heart: :blue_heart: :purple_heart: :yellow_heart: :heart:

bruteforcecat

comment created time in 2 days

push eventphoenixframework/phoenix

KaFai

commit sha 229cd354f780e3f15bca14228e2cd3cd01d14472

Allow setting mfa for loading socket session config (#3668)

view details

push time in 2 days

PR merged phoenixframework/phoenix

Allow setting mfa for loading socket session config

The purpose of this PR is to allow loading session config through MFA in Phoenix.Endpoint.socket. For a more detail info, please check our this issue

+92 -5

0 comment

3 changed files

bruteforcecat

pr closed time in 2 days

issue closedphoenixframework/phoenix

Missing capability of loading runtime config for Phoenix.Endpoint.socket

Environment

  • Elixir version (elixir -v): 1.9.1
  • Phoenix version (mix deps): 1.4.12
  • NodeJS version (node -v): 11.13.0
  • NPM version (npm -v): 6.13.6
  • Operating system: macOS Catalina

Expected behavior

Currently there is no way to set config for Phoenix.Endpoint.socket/3 at runtime. This leads to the problem that the socket cannot fetch the session if the config for Plug.Session is set at runtime, for example using Plug.builder_opts.

My particular example is that I set the Plug.Session config at runtime by loading signing_key and encryption_key from env vars. However, because of the above issue, it looks like there is no way to let Phoenix.Endpoint and Phoenix.Socket.Transport fetch the session with the correct session config for the LiveView socket (https://github.com/phoenixframework/phoenix_live_view/blob/master/guides/introduction/installation.md), as the config is assumed to be set only at compile time.

Actual behavior

Users should be able to set Phoenix.Endpoint.socket/3 config at runtime, like with Plug. I wonder if it makes sense to allow passing a function with arity 0 to session, like the following:

  socket "/live", Phoenix.LiveView.Socket,
      websocket: [connect_info: [session: load_session/0]],
      longpoll: [connect_info: [session: load_session/0]]

I would like to hear what you think. I am also willing to help on this one if you think this is something we should support.
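For reference, a sketch of the shape this could take (module and function names are hypothetical): instead of a literal keyword list, the session option accepts a {module, function, args} tuple invoked at runtime, which matches the MFA approach of the merged "Allow setting mfa for loading socket session config" PR.

# in the endpoint
socket "/live", Phoenix.LiveView.Socket,
  websocket: [connect_info: [session: {MyAppWeb.Session, :options, []}]],
  longpoll: [connect_info: [session: {MyAppWeb.Session, :options, []}]]

defmodule MyAppWeb.Session do
  # Invoked when the socket connects, so env vars are read at runtime,
  # not at compile time.
  def options do
    [
      store: :cookie,
      key: "_my_app_key",
      signing_salt: System.fetch_env!("SESSION_SIGNING_SALT")
    ]
  end
end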

closed time in 2 days

bruteforcecat

issue commentphoenixframework/phoenix

Missing capability of loading runtime config for Phoenix.Endpoint.socket

Closing in favor of PR. Thanks!

bruteforcecat

comment created time in 2 days

pull request commentphoenixframework/phoenix

Fix Phoenix.Ecto.CheckRepoStatus name in changelog

:green_heart: :blue_heart: :purple_heart: :yellow_heart: :heart:

novaugust

comment created time in 2 days

push eventphoenixframework/phoenix

Matt Enlow

commit sha 3666c150954bca1d8c22d59a85b54dd409f687c5

Fix Phoenix.Ecto.CheckRepoStatus name in changelog (#3673)

view details

push time in 2 days

PR merged phoenixframework/phoenix

Fix Phoenix.Ecto.CheckRepoStatus name in changelog

Noticed this typo while keeping an eye on 1.5 changes :)

+1 -1

0 comment

1 changed file

novaugust

pr closed time in 2 days

pull request commentphoenixframework/phoenix

Replace includes with indexOf in phoenix.js

:green_heart: :blue_heart: :purple_heart: :yellow_heart: :heart:

indrekj

comment created time in 2 days

push eventphoenixframework/phoenix

Indrek Juhkam

commit sha 5c61cd64070f64e1eb601d36941250c6823df507

Replace includes with indexOf in phoenix.js (#3667)

IE11 does not support Array.prototype.includes function. Instead use indexOf.

view details

push time in 2 days

PR merged phoenixframework/phoenix

Replace includes with indexOf in phoenix.js kind:enhancement

Sadly we have to support IE11, which does not support Array.prototype.includes(). The fix itself is quite simple: instead of includes we can use indexOf.

+1 -1

3 comments

1 changed file

indrekj

pr closed time in 2 days

issue commentbeam-telemetry/telemetry

Add telemetry:span/3

So I would like to go ahead and propose that we standardize on "failure". Reviewed "spec" (a rough sketch of these semantics follows the list):

  • A "start" event is emitted at the beginning of every span
  • Then ONE OF "stop" and "failure" will be emitted:
    • "stop" is emitted if the function did not throw, error or exit
    • "failure" is emitted if the function did throw, error or exit
  • If the process crashes due to a link, then neither "stop" nor "failure" may be emitted
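A rough sketch of these semantics (function and measurement names are illustrative, not the final API):

defmodule MySpan do
  def span(event_prefix, start_metadata, fun) do
    start_time = System.monotonic_time()
    :telemetry.execute(event_prefix ++ [:start], %{monotonic_time: start_time}, start_metadata)

    try do
      # The callback returns the result plus the metadata for the stop event.
      {result, stop_metadata} = fun.()

      :telemetry.execute(
        event_prefix ++ [:stop],
        %{duration: System.monotonic_time() - start_time},
        stop_metadata
      )

      result
    catch
      kind, reason ->
        # Emitted instead of "stop" when the function throws, errors or exits.
        :telemetry.execute(
          event_prefix ++ [:failure],
          %{duration: System.monotonic_time() - start_time},
          Map.merge(start_metadata, %{kind: kind, reason: reason, stacktrace: __STACKTRACE__})
        )

        :erlang.raise(kind, reason, __STACKTRACE__)
    end
  end
end

If the process is killed by a linked process, the catch clause never runs, which is why neither "stop" nor "failure" is guaranteed in that case.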
josevalim

comment created time in 2 days

push eventphoenixframework/phoenix_live_view

José Valim

commit sha 2efdc70868319ca322bf8897cc418aa891250675

Update bug_report.md

view details

push time in 2 days

push eventphoenixframework/phoenix_live_view

José Valim

commit sha 1005dc9214e4dacee45bfe9e331327fe5bf5d2a3

Update bug_report.md

view details

push time in 2 days

issue closedphoenixframework/phoenix_live_view

Unify LiveView flash and Phoenix flash in layouts

  • Changes
    • [x] Refactor Phoenix.LiveView.Flash to directly set assigns instead of calling fetch_flash again (requires Phoenix v1.5)
  • Test scenarios
    • [x] redirect from controller to live_view with hard link
    • [x] redirect from live_view to live_view with hard link
    • [x] redirect from live_view to controller with hard link
    • [x] Flash within live redirect
    • [x] Flash within live patch

closed time in 2 days

josevalim

push eventphoenixframework/phoenix_live_view

José Valim

commit sha 2d6c6e0c3bd6010588ce524cda6802c3218dba1c

Fix flash pending TODOs

view details

push time in 2 days

push eventphoenixframework/phoenix

José Valim

commit sha 3fdeca82cb88b366a805792d8dee04f3a79c4cc1

Ensure fetch_flash only runs once and add merge_flash

view details

push time in 2 days

push eventphoenixframework/phoenix

José Valim

commit sha 9fe4c018d346d3528adf432227a99cbd48965070

Ensure fetch_flash only runs once and add merge_flash

view details

push time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule Kernel do
     raise ArgumentError, "modifier must be one of: s, a, c"
   end
 
+  @doc """
+  Handles the `~PID` sigil.
+  """

Your call. The clearer we can be, the better.

wojtekmach

comment created time in 2 days

pull request commentelixir-lang/elixir

Multi-letter sigils

In the docs for the Macro module, we should note that Elixir originally only had single-letter sigils, that multi-letter ones were added in v1.11.0, and that we encourage developers to use the multi-letter ones.

wojtekmach

comment created time in 2 days

pull request commentelixir-lang/elixir

Multi-letter sigils

We need to update the docs for the Macro module and the Syntax Reference page.

wojtekmach

comment created time in 2 days

push eventelixir-lang/elixir

José Valim

commit sha 4de2cd7d95ad3a3c711d6f734554e94982b947a3

Clarify inspect docs

view details

push time in 2 days

Pull request review commentelixir-lang/elixir

Multi-letter sigils

 defmodule Kernel do
     raise ArgumentError, "modifier must be one of: s, a, c"
   end
 
+  @doc """
+  Handles the `~PID` sigil.
+  """

We should probably add a note about hardcoding PID values. Same for Port and Reference. The idea is that you can use these sigils for copying and pasting while debugging, but PIDs are not guaranteed to be the same between VM runs.

wojtekmach

comment created time in 2 days
