If you are wondering where this site's data comes from, please visit https://api.github.com/users/ItalyPaleAle/events. GitMemory does not store any data; it only uses NGINX to cache data for a period of time. The idea behind GitMemory is simply to give users a better reading experience.
Alessandro (Ale) Segala (ItalyPaleAle) · Microsoft Corporation · Seattle, WA · https://withblue.ink

ItalyPaleAle/amaretto-osx 2

Amaretto is a framework and a boilerplate to create OSX apps using HTML5/JavaScript that interact with native code.

ItalyPaleAle/azure-guacamole 2

Guacamole RDP/VNC client - Azure Resource Manager template

ItalyPaleAle/calendar-next-demo 1

"Next on my calendar" demo app: static app interacting with the Office 365 calendar APIs

ItalyPaleAle/actions 0

Automate your GitHub workflows using Azure Actions

ItalyPaleAle/aes-kw 0

AES-KW algorithm as per RFC 3394

ItalyPaleAle/age 0

A simple, modern and secure encryption tool (and Go library) with small explicit keys, no config options, and UNIX-style composability.

ItalyPaleAle/apollo-server 0

:earth_africa: GraphQL server for Express, Connect, Hapi and Koa

ItalyPaleAle/appservice-actions 0

Enable GitHub developers to deploy to Azure WebApps using GitHub Actions


created tag ItalyPaleAle/arraybuffer

tag v1.0.1

NPM: arraybuffer-encoding

created 2 days ago

release ItalyPaleAle/arraybuffer

v1.0.1

released 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha 3bf5d1f7b6fb9a21d47459adcf9c03887454ab32

Docs update

pushed 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha 77d8735d3f8200a441bc96e3500b87d6fad0376c

Fixed exports paths

pushed 2 days ago

release ItalyPaleAle/arraybuffer

v1.0.0

released 2 days ago

created tag ItalyPaleAle/arraybuffer

tag v1.0.0

NPM: arraybuffer-encoding

created 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha db0273ffcff37646ceb30fc07d4b4fd0255775da

Docs update

pushed 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha e7ebd001cb2894d50dfcd20f9c056e0ef822916f

Link to docs

pushed 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha 587bcfee6567be11d17bcaa5f8172860c29b5900

Added more badges

Alessandro Segala (ItalyPaleAle)

commit sha 24bbd636efc40b85d8d98653aa42e9a0904bcf59

Better comments. Also, base64/index.ts exports Url and Standard as properties too

Alessandro Segala (ItalyPaleAle)

commit sha 9f70fb2c42c7201c116c543be223a9bba0bfb472

1.0.0

Alessandro Segala (ItalyPaleAle)

commit sha a0364319fea6d691bd1be40a50bb7970ee37aa3b

Added docs

pushed 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha 26064be3b27b49ccb4fa40422ca6a0e436d15385

Rename package

pushed 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha 95c23c30221281bdba71174df5ecc4d6022d149b

Publish 0.9.0

Alessandro Segala (ItalyPaleAle)

commit sha 11046a01e01345b9832c61dbe25db043876b4021

Excluding files from NPM

pushed 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha 9ee4f0e9512104c1b43357d6551ee301bc52f724

Remove cache for Node.js CI

pushed 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro Segala (ItalyPaleAle)

commit sha ea38377c58f7221e6dd28fa93ff066fdaaae4309

It's YAML

Alessandro Segala (ItalyPaleAle)

commit sha 558ec36753141e300427cdcc9f8a5d8ce5e0b3f6

For modules, there's no package-lock.json

pushed 2 days ago

push event ItalyPaleAle/arraybuffer

Alessandro (Ale) Segala

commit sha 6d217d97be3276e86476eb03d3ea53315a72c714

Create node.js.yaml

pushed 2 days ago

issue closed ItalyPaleAle/svelte-spa-router

How to get the previous location, i.e. where we came from?

import { location } from "svelte-spa-router"; can give you the current location, and this is great. I find myself wanting to know where I'm coming from. Is there an import { previousLocation } from "svelte-spa-router";?

The reason I'm looking for this is that I need to manage Svelte transitions from the transition's point of view, not from the component's (state) point of view. A transition A -> B would not be the same as A -> C.

closed 5 days ago

LeoFlandin

issue comment ItalyPaleAle/svelte-spa-router

How to get the previous location, i.e. where we came from?

This can't be done in the router, because the router's state is managed by the browser's history, and JavaScript apps can't access the pages that users visited beforehand. Essentially, if I were to implement this, it wouldn't survive page reloads, and users wouldn't be happy :)

Your best bet is to use a route event (such as routeLoaded) and maintain a stack of pages viewed in memory. This will need to be code that is part of your application.
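A minimal sketch of that approach, in application code: it assumes the Router component dispatches a routeLoaded event whose detail carries the current location, and previousLocation is an illustrative helper name, not a library export.

```javascript
// Keep an in-memory stack of visited locations.
// Assumption: svelte-spa-router's Router dispatches a "routeLoaded" event
// whose event.detail includes the current location (check the library docs).
// "previousLocation" is a hypothetical helper, not part of the library.
const visited = [];

function onRouteLoaded(event) {
  visited.push(event.detail.location);
}

function previousLocation() {
  // The last entry is the current page; the one before it is where we came from
  return visited.length > 1 ? visited[visited.length - 2] : null;
}
```

You would wire this up in your root component, e.g. `<Router {routes} on:routeLoaded={onRouteLoaded} />`, and note that the stack lives in memory only, so it resets on a full page reload, exactly as described above.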

LeoFlandin

comment created 5 days ago


Pull request review comment dapr/go-sdk

Rationalize PublishEvent* APIs

 package client
 
 import (
 	"context"
 	"encoding/json"
+	"log"
 
 	pb "github.com/dapr/go-sdk/dapr/proto/runtime/v1"
 	"github.com/pkg/errors"
 )
 
+// PublishEventOption is the type for the functional option.
+type PublishEventOption func(*pb.PublishEventRequest)
+
 // PublishEvent publishes data onto specific pubsub topic.
-func (c *GRPCClient) PublishEvent(ctx context.Context, pubsubName, topicName string, data []byte) error {
+func (c *GRPCClient) PublishEvent(ctx context.Context, pubsubName, topicName string, data interface{}, opts ...PublishEventOption) error {
 	if pubsubName == "" {
 		return errors.New("pubsubName name required")
 	}
 	if topicName == "" {
 		return errors.New("topic name required")
 	}
 
-	envelop := &pb.PublishEventRequest{
+	request := &pb.PublishEventRequest{
 		PubsubName: pubsubName,
 		Topic:      topicName,
-		Data:       data,
+	}
+	for _, opt := range opts {
+		opt(request)
+	}
+
+	if data != nil {
+		switch d := data.(type) {
+		case []byte:
+			request.Data = d
+		case string:
+			request.Data = []byte(d)
+		default:
+			var err error
+			request.DataContentType = "application/json"
+			request.Data, err = json.Marshal(d)

Yes, because it needs to be declared before L41. Otherwise, request.Data, err := json.Marshal(d) would not compile: the short variable declaration := requires identifiers on its left side and can't assign to the existing struct field request.Data.

ItalyPaleAle

comment created 5 days ago

issue closed ItalyPaleAle/SMCloudStore

direct upload from browser

May I know whether storage.putObject allows direct uploads from the browser, or does it have to run server-side in Node.js?

closed 5 days ago

cometta

issue comment ItalyPaleAle/SMCloudStore

direct upload from browser

This library is primarily meant for usage from Node.js. Supporting browsers would require major changes.

cometta

comment created 5 days ago

Pull request review comment dapr/docs

Fix 1755

 spec:
   version: v1
   metadata:
   - name: accountName
-    value: <REPLACE-WITH-ACCOUNT-NAME>
+    value: "[your_account_name]"
   - name: accountKey
-    value: <REPLACE-WITH-ACCOUNT-KEY>
+    value: "[your_account_key]"
   - name: containerName
-    value: <REPLACE-WITH-CONTAINER-NAME>
+    value: "[your_container_name]"
 ```
 
 {{% alert title="Warning" color="warning" %}}
 The above example uses secrets as plain strings. It is recommended to use a secret store for the secrets as described [here]({{< ref component-secrets.md >}}).
 {{% /alert %}}
-
 ## Spec metadata fields
 
 | Field              | Required | Details | Example |
 |--------------------|:--------:|---------|---------|
 | accountName        | Y        | The storage account name | `"mystorageaccount"`.
-| accountKey         | Y        | Primary or secondary storage key | `"key"`
+| accountKey         | Y (unless using Azure AD) | Primary or secondary storage key | `"key"`
 | containerName      | Y         | The name of the container to be used for Dapr state. The container will be created for you if it doesn't exist  | `"container"`
-| ContentType        | N        | The blob's content type | `"text/plain"`
+| `azureEnvironment` | N | Optional name for the Azure environment if using a different Azure cloud | `"AZUREPUBLICCLOUD"` (default value), `"AZURECHINACLOUD"`, `"AZUREUSGOVERNMENTCLOUD"`, `"AZUREGERMANCLOUD"`
+| ContentType        | N        | The blob's content type | `"text/plain"`
 | ContentMD5         | N        | The blob's MD5 hash | `"vZGKbMRDAnMs4BIwlXaRvQ=="`
 | ContentEncoding    | N        | The blob's content encoding | `"UTF-8"`
 | ContentLanguage    | N        | The blob's content language | `"en-us"`
 | ContentDisposition | N        | The blob's content disposition. Conveys additional information about how to process the response payload | `"attachment"`
 | CacheControl       | N        | The blob's cache control | `"no-cache"`
 
-## Setup Azure Blobstorage
+## Setup Azure Blob Storage
 
 [Follow the instructions](https://docs.microsoft.com/en-us/azure/storage/common/storage-account-create?tabs=azure-portal) from the Azure documentation on how to create an Azure Storage Account.
 
-If you wish to create a container for Dapr to use, you can do so beforehand. However, Blob Storage state provider will create one for you automatically if it doesn't exist.
+If you wish to create a container for Dapr to use, you can do so beforehand. However, the Blob Storage state provider will create one for you automatically if it doesn't exist.
 
 In order to setup Azure Blob Storage as a state store, you will need the following properties:
-- **AccountName**: The storage account name. For example: **mystorageaccount**.
-- **AccountKey**: Primary or secondary storage key.
-- **ContainerName**: The name of the container to be used for Dapr state. The container will be created for you if it doesn't exist.
+
+- **accountName**: The storage account name. For example: **mystorageaccount**.
+- **accountKey**: Primary or secondary storage account key.
+- **containerName**: The name of the container to be used for Dapr state. The container will be created for you if it doesn't exist.
+
+### Authenticating with Azure AD
+
+This component supports authentication with Azure AD as an alternative to use account keys. Whenever possible, it is reccomended that you use Azure AD for authentication in production systems, to take advantage of better security, fine-tuned access control, and the ability to use managed identities for apps running on Azure.

I had actually suggested simplifying this to "Whenever possible, [you should] use Azure AD to authenticate your Dapr components, for increased security..." to avoid the passive voice.

ItalyPaleAle

comment created 5 days ago


Pull request review comment dapr/docs

Fix 1755

+---
+type: docs
+title: "Authenticating to Azure"
+linkTitle: "Authenticating to Azure"
+description: "How to authenticate Azure components using Azure AD and/or Managed Identities"
+aliases:
+  - "/operations/components/setup-secret-store/supported-secret-stores/azure-keyvault-managed-identity/"
+  - "/reference/components-reference/supported-secret-stores/azure-keyvault-managed-identity/"
+---
+
+## Common Azure authentication layer
+
+Certain Azure components for Dapr offer support for the *common Azure authentication layer*, which enables applications to access data stored in Azure resources by authenticating with Azure AD. Thanks to this, administrators can leverage all the benefits of fine-tuned permissions with RBAC (Role-Based Access Control), and applications running on certain Azure services such as Azure VMs, Azure Kubernetes Service, or many Azure platform services can leverage [Managed Service Identities (MSI)](https://docs.microsoft.com/azure/active-directory/managed-identities-azure-resources/overview).
+
+Some Azure components offer alternative authentication methods, such as systems based on "master keys" or "shared keys". Whenever possible, we recommend authenticating your Dapr components using Azure AD for increased security and ease of management, as well as for the ability to leverage MSI if your app is running on supported Azure services.

This one can be simplified to "Whenever possible, use Azure AD to authenticate your Dapr components, for increased security..."

ItalyPaleAle

comment created 5 days ago


push event ItalyPaleAle/dapr-docs

Alessandro (Ale) Segala

commit sha 663c3aaa14b31057c3563c2d2a89695db2238c5b

Update daprdocs/content/en/developing-applications/integrations/cloud-providers/authenticating-azure.md Co-authored-by: greenie-msft <56556602+greenie-msft@users.noreply.github.com>

pushed 5 days ago

push event ItalyPaleAle/dapr-docs

Alessandro (Ale) Segala

commit sha 663cc421842ccbebfa64d4d67c48917f0dfa1c56

Update daprdocs/content/en/developing-applications/integrations/cloud-providers/authenticating-azure.md Co-authored-by: greenie-msft <56556602+greenie-msft@users.noreply.github.com>

pushed 5 days ago

pull request comment dapr/docs

Fix 1755

I've updated the Azure Blob Storage doc too. PR is complete and ready for your review.

ItalyPaleAle

comment created 9 days ago

push event ItalyPaleAle/dapr-docs

ItalyPaleAle

commit sha 57af58d496f811bffee85636a31a6ad8371b0f67

Updated the Azure Blob Storage state store doc

pushed 9 days ago