started hypergraph-xyz/editor
started time in 3 days
started bwrrp/xspattern.js
started time in 6 days
started tensorflow/tfjs
started time in 12 days
started ChenYuHo/handwriting.js
started time in 12 days
started googlecolab/backend-container
started time in 13 days
PR opened ViktorQvarfordt/unicode-latex
fixes #1
pr created time in 15 days
issue opened ViktorQvarfordt/unicode-latex
There is an invalid trailing comma after the last item in each of the JSON files.
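For illustration only (not copied from the repository's JSON files), a strict JSON parser rejects a trailing comma after the last item:
// Hypothetical example: JSON.parse throws on a trailing comma and accepts the same data without it.
JSON.parse('{"a": 1,}') // throws SyntaxError
JSON.parse('{"a": 1}')  // { a: 1 }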
created time in 15 days
started ViktorQvarfordt/unicode-latex
started time in 15 days
started zaidalyafeai/zaidalyafeai.github.io
started time in 16 days
started githubharald/SimpleHTR
started time in 16 days
started erikbern/deep-fonts
started time in 16 days
started nextapps-de/flexsearch
started time in 17 days
started GoogleChromeLabs/react-adaptive-hooks
started time in 19 days
started openid/AppAuth-JS
started time in 19 days
push event hubgit/php7-apache-saxonhe
commit sha 42570d4c15b67931c885958a2d2541f7280c4b78
Upgrade Saxon-HE
push time in 21 days
started tinacms/tinacms
started time in 24 days
started allenai/s2-gorc
started time in 24 days
started zotero/bib-web
started time in a month
started sinclairzx81/zero
started time in a month
created repository JATS4R/authoring
JATS-based interoperability for article authoring tools
created time in a month
started zotero/translation-server
started time in a month
started pshihn/rough
started time in a month
push event hubgit/astrocite
commit sha 7699a4b4e3a6c656fdc0659bc68766420d73db79
Add identifiers to BibTeX field mapping (#16) Add ISBN, ISSN, PMCID and PMID to BibTeX -> CSL field mapping
commit sha 157b5e15cbfa68502eb35539cf5487898953866e
chore: upgrade devDependencies
commit sha c2002abd4eeae0571d515de7430d671466ea5dff
v0.15.4
commit sha 775c73bef08a9d82c438282f61e6acfe8828836d
[eutils] Treat fields that map to "title" and "container-title" as HTML (#19) * Treat fields that map to "title" and "container-title" as HTML * Remove astrocite-core from package-lock.json again
commit sha dbb3faf2e67d4bfdb9873c4e9f1eb3fc633bf571
Merge remote-tracking branch 'upstream/master' into collective-names
push time in a month
Pull request review comment dsifford/astrocite
[eutils] Improve handling of article identifiers
const FIELD_TRANSFORMS = new Map<
    [
        'articleids',
        ({ articleids, uid }) => {
-           return {
-               URL: articleids.some(({ value }) => value === uid)
-                   ? `https://www.ncbi.nlm.nih.gov/pubmed/${uid}`
-                   : `https://www.ncbi.nlm.nih.gov/pmc/articles/PMC${uid}`,
+           const data: Partial<Data> = {};
+
+           const chooseURL = () => {
+               const pmcid = articleids.find(({ idtype, value }) => {
+                   return idtype === 'pmcid' && value === `PMC${uid}`;
+               });
+
+               if (pmcid) {
+                   return `https://www.ncbi.nlm.nih.gov/pmc/articles/${pmcid.value}`;
+               }
+
+               const pmid = articleids.find(({ idtype, value }) => {
+                   return idtype === 'pubmed' && value === uid;
+               });
+
+               if (pmid) {
+                   return `https://www.ncbi.nlm.nih.gov/pubmed/${pmid.value}`;
+               }
+
+               return undefined;
            };
+
+           const url = chooseURL();
+
+           if (url) {
+               data.URL = url;
+           }
db=pubmed: https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?db=pubmed&id=28507505&retmode=json
db=pmc: https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?db=pmc&id=5410569&retmode=json
comment created time in a month
push event hubgit/astrocite
commit sha b736303af06eb0abadaee0f45e0c42563ddff673
Use undefined instead of null
push time in a month
Pull request review comment dsifford/astrocite
[eutils] Convert collective author names to literal names
const FIELD_TRANSFORMS = new Map<
    ({ authors }) => {
        return {
            author: authors
-               .map(({ name }) => {
-                   const [family, given] = name.split(' ');
-                   return {
-                       family: family || '',
-                       ...(given ? { given } : {}),
-                   };
+               .map(({ authtype, name }) => {
+                   switch (authtype) {
+                       case 'CollectiveName':
+                           if (!name) {
+                               return null;
+                           }
+
+                           return {
+                               literal: name,
+                           };
+
+                       case 'Author':
+                       default:
+                           const [family, given] = name.split(' ');
+
+                           if (!family) {
+                               return null;
+                           }
+
+                           return {
+                               family,
+                               ...(given ? { given } : {}),
+                           };
+                   }
                })
-               .filter(({ family }) => family !== ''),
+               .filter(item => item !== null) as Person[],
Sounds good - updated to use undefined.
comment created time in a month
Pull request review comment dsifford/astrocite
[eutils] Improve handling of article identifiers
Updated to process the identifiers in a single loop.
comment created time in a month
push event hubgit/astrocite
commit sha 3b1ea12a618817d74778c4727e14d757076f9338
Use a single loop
push time in a month
Pull request review comment dsifford/astrocite
[eutils] Improve handling of article identifiers
The code in the PR is more explicit, by comparing those values to what's expected based on the uid.
Instead of looping through several times, it could perhaps build an object containing the articleid keys and values in a single step.
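A rough sketch of that single-pass idea (hypothetical code, not what was committed in the PR): index the articleids array by idtype once, then choose the URL from the lookup.
// Hypothetical sketch: build an idtype -> value lookup in one pass over articleids,
// then pick the URL from it.
const chooseURL = (articleids, uid) => {
    const ids = articleids.reduce((acc, { idtype, value }) => {
        acc[idtype] = value;
        return acc;
    }, {});

    if (ids.pubmed === uid) {
        return `https://www.ncbi.nlm.nih.gov/pubmed/${ids.pubmed}`;
    }

    if (ids.pmcid === `PMC${uid}`) {
        return `https://www.ncbi.nlm.nih.gov/pmc/articles/${ids.pmcid}`;
    }

    return undefined;
};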
comment created time in a month
Pull request review comment dsifford/astrocite
[eutils] Improve handling of article identifiers
It's tricky, because the pmcid, pmc, pmid and pubmed articleid values are different in eSummary responses when the db is either "pubmed" or "pmc".
I've had another look, and it might be safe to say that if there's a "pubmed" articleid (only present in "pubmed" responses) then use that, otherwise if there's a "pmcid" articleid (present in both responses, but only with a useful value in "pmc" responses), then use that.
comment created time in a month
PR opened dsifford/astrocite
As well as adding a DOI value from the articleids array, this improves handling of PMID and PMCID values and URL generation from those IDs.
Fixes #21
pr created time in a month
push event hubgit/astrocite
commit sha 81b10808e44b2e63cf5062f085d8f2448bb592c7
Combine pmc and pmcid handling
push time in a month
push event hubgit/astrocite
commit sha c1d4a705e15fe2daa19b08d5d61896c650c428b3
Refactor
push time in a month
issue opened dsifford/astrocite
[eutils] Use DOI from article identifiers
Because a DOI is often used to identify a citation, it would be useful to have the DOI included in the data parsed from the eSummary results.
Code
const fetch = require('node-fetch')
const { toCSL } = require('astrocite-eutils')
fetch('https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?db=pubmed&id=28507505&retmode=json')
.then(response => response.json())
.then(toCSL)
.then(items => items[0].DOI)
Expected behavior:
Output: 10.3389/fnmol.2017.00112
Actual behavior:
Output: undefined
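A minimal sketch of the kind of transform that could satisfy this (illustrative only, mirroring the shape of the existing articleids handling rather than the library's actual code):
// Sketch: pull the DOI out of the eSummary articleids array, if present.
const doiFromArticleIds = articleids => {
    const doi = articleids.find(({ idtype }) => idtype === 'doi');
    return doi && doi.value ? { DOI: doi.value } : {};
};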
created time in a month
PR opened dsifford/astrocite
When transforming the authors array, if the author's authtype equals CollectiveName, return just a literal name string.
Fixes #18
pr created time in a month
pull request comment dsifford/astrocite
[eutils] Treat fields that map to "title" and "container-title" as HTML
Using eFetch XML rather than eSummary JSON as the source data would be the ideal solution, but that would be a bigger change so I haven't attempted it here.
comment created time in a month
PR opened dsifford/astrocite
This treats eSummary entry fields that were mapped to CSL "title" or "container-title" fields as HTML, and runs them through the decode method of he to decode any HTML entities.
fixes #17
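A minimal sketch of the decoding step, assuming the he package's decode function (how the PR actually wires this into the field transforms is in the diff, not reproduced here):
const { decode } = require('he');

// Decode named and numeric HTML entities in a string field, e.g. an eSummary title.
const decodeField = value => (typeof value === 'string' ? decode(value) : value);

decodeField('Trends in ecology &amp; evolution'); // 'Trends in ecology & evolution'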
pr created time in a month
push event hubgit/astrocite
commit sha 5f015f70bf2524629fcd14b958d80a356095564a
Remove astrocite-core from package-lock.json again
push time in a month
issue opened ORCID/ORCID-Source
Add "authorization" to allowed CORS headers for oauth/userinfo requests
I'm using the oidc-client library to authenticate users, and after successfully authenticating a user with ORCID it makes a cross-domain request to fetch the user's profile information.
Unfortunately an OPTIONS request to https://sandbox.orcid.org/oauth/userinfo fails: the request has the Access-Control-Request-Headers: authorization header but the response has only access-control-allow-headers: X-Requested-With,Origin,Content-Type,Accept,x-csrf-token - authorization is missing from the allowed headers.
Would it be possible to add authorization to the list of allowed headers?
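For context, this is roughly the kind of request that triggers the failing preflight (illustrative only; in practice oidc-client makes the request internally, and accessToken here is a placeholder):
// Any cross-origin request with an Authorization header makes the browser send a
// preflight OPTIONS request listing Access-Control-Request-Headers: authorization.
const accessToken = '...'; // placeholder: token obtained from the OAuth flow
fetch('https://sandbox.orcid.org/oauth/userinfo', {
    headers: { Authorization: `Bearer ${accessToken}` },
})
    .then(response => response.json())
    .then(profile => console.log(profile));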
created time in a month
push event hubgit/php7-apache-saxonhe
commit sha 9b2e4ac589ca11ca99aca444021b4027b42a8bfd
Remove patches and upgrade to Saxon/C v1.2.0
commit sha 88af11c5f2a651090a659c43b4c32fe504c66788
Use a less specific PHP Docker image
push time in a month
issue opened dsifford/astrocite
[eutils] Convert collective author names to literal names
When an author has authtype "CollectiveName", the name should be treated as a literal string.
Code
const fetch = require('node-fetch')
const { toCSL } = require('astrocite-eutils')
fetch('https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?db=pubmed&id=21255862&retmode=json')
.then(response => response.json())
.then(toCSL)
.then(items => items[0].author)
"authors": [
{
"name": "Lister AM",
"authtype": "Author",
},
{
"name": "Climate Change Research Group.",
"authtype": "CollectiveName",
}
],
Expected behavior:
[Object {family: "Lister", given: "AM"}, Object {literal: "Climate Change Research Group."}]
Actual behavior:
[Object {family: "Lister", given: "AM"}, Object {family: "Climate", given: "Change"}]
created time in a month
issue comment dsifford/astrocite
[eutils] Decode HTML entities in eSummary response fields
Here's an example of a journal title containing an ampersand:
"fulljournalname": "Trends in ecology & evolution"
comment created time in a month
issue comment dsifford/astrocite
[eutils] Decode HTML entities in eSummary response fields
According to the PubMed DTD, markup can be b, i, sup, sub and u.
<!ENTITY % text "#PCDATA | b | i | sup | sub | u" >
comment created time in a month
issue opened dsifford/astrocite
[eutils] Decode HTML entities in eSummary response fields
It seems that string fields in JSON returned from the eSummary API contain markup that's HTML-encoded.
e.g. https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?db=pubmed&id=28507505&retmode=json
This is unexpected but I imagine unlikely to change, unless it's a new bug, so fields like title and containerTitle need to go through a step where named HTML entities are converted to Unicode characters.
Code
const fetch = require('node-fetch')
const { toCSL } = require('astrocite-eutils')
fetch('https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?db=pubmed&id=28507505&retmode=json')
.then(response => response.json())
.then(toCSL)
.then(items => items[0].title)
Expected behavior:
The article title contains HTML markup.
Protein-Protein Interaction Among the FoxP Family Members and their Regulation of Two Target Genes, <i>VLDLR</i> and <i>CNTNAP2</i> in the Zebra Finch Song System.
Actual behavior:
The article title contains HTML-encoded markup.
Protein-Protein Interaction Among the FoxP Family Members and their Regulation of Two Target Genes, &lt;i&gt;VLDLR&lt;/i&gt; and &lt;i&gt;CNTNAP2&lt;/i&gt; in the Zebra Finch Song System.
created time in a month
delete branch hubgit/react-app-template
delete branch: dependabot/npm_and_yarn/eslint-utils-1.4.2
delete time in 2 months
PR closed hubgit/react-app-template
⚠️ Dependabot is rebasing this PR ⚠️
If you make any changes to it yourself then they will take precedence over the rebase.
Bumps eslint-utils from 1.4.0 to 1.4.2.
Commits:
4e1bc07 1.4.2
e4cb014 🐛 add null test
230a4e2 1.4.1
08158db 🐛 fix getStaticValue security issue
587cca2 🐛 fix getStringIfConstant to handle literals correctly
c119e83 🐛 fix getStaticValue to handle bigint correctly
See full diff in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
Dependabot commands and options:
You can trigger Dependabot actions by commenting on this PR:
@dependabot rebase will rebase this PR
@dependabot recreate will recreate this PR, overwriting any edits that have been made to it
@dependabot merge will merge this PR after your CI passes on it
@dependabot squash and merge will squash and merge this PR after your CI passes on it
@dependabot cancel merge will cancel a previously requested merge and block automerging
@dependabot reopen will reopen this PR if it is closed
@dependabot ignore this [patch|minor|major] version will close this PR and stop Dependabot creating any more for this minor/major version (unless you reopen the PR or upgrade to it yourself)
@dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
@dependabot use these labels will set the current labels as the default for future PRs for this repo and language
@dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
@dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
@dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language
You can disable automated security fix PRs for this repo from the Security Alerts page.
pr closed time in 2 months
push event hubgit/react-app-template
commit sha f3e887466aed0a19e11d24a8df54dba91fc8cb0e
Rename folder + index file
commit sha 1894bcc2b5ed6aae104f435855d45e440e0e6bb4
Add .json to module extensions
commit sha c7cea1fd426cf13f285a542c109d03190da98ae3
Delete now.json
commit sha 73ae354b0b46564fe13067d17fbfe40bac716923
Delete .nowignore
commit sha 84366f69fdcafd0d4f3ff602be439f4049bb2270
Update README.md
commit sha ba85955b7e2d363f60b85dfacd4a7a80a2f1335f
Upgrade dependencies
push time in 2 months
issue closed retorquere/bibtex-parser
Is a mapping available anywhere from this tool's output format to the standard fields of CSL JSON?
closed time in 2 months
issue comment retorquere/bibtex-parser
I see, thanks for the explanation. I'd added both astrocite and this library to a comparison and hadn't spotted that both were using the same parser.
If there's any difference in the final output I might still look at a map to CSL, partly to ease comparison.
comment created time in 2 months
push event hubgit/citation-parsers
commit sha b88cbb3bd34fe5e125644d608f5b0ddbe7112dc7
Add bibtex
commit sha 8d201c5c3963eb12289e5a42e1ef30837254b883
Remove bibtex
push time in 2 months
started svenkreiss/unicodeit
started time in 2 months
issue opened retorquere/bibtex-parser
Is a mapping available anywhere from this tool's output format to the standard fields of CSL JSON?
created time in 2 months
push event hubgit/citation-parsers
commit sha e68cd0cc929ae8bc32cfb4cbadd938f7bfcd60a6
Update App.tsx
push time in 2 months
push event hubgit/citation-parsers
commit sha 4638649a46850877a154e6b3de709880d81c2ec8
Refactor, remove some commands from abstract
push time in 2 months
push event hubgit/citation-parsers
commit sha a690edc5679537c86dd3c1cc08f6d2efdcad75eb
Remove bibtex-parse-js
commit sha 2e9cffa8a8255577c04ac3e3b8fe3afaecbd6604
Restore bibtex-parse-js
push time in 2 months
push event hubgit/citation-parsers
commit sha 1659c7f962ab44c47c2319c5fa8131e88ee4ec48
Add bibtex-parser
push time in 2 months
push event hubgit/citation-parsers
commit sha d7c6e7ab1c8bec145740e1bb9a97aa3d757fe8a2
Add links, fix CSL output
push time in 2 months
push event hubgit/citation-parsers
commit sha 80a59968224eb25f3ac53e92ac27926e89fa05e3
Revert name
push time in 2 months
push event hubgit/citation-parsers
commit sha 476784bf00d1f99e5f14df8d0c62c86814e986cf
Update App.tsx
push time in 2 months
push event hubgit/citation-parsers
commit sha 6234b06b01ce313550b3908b868fae8ea87e670f
Change name
push time in 2 months
push event hubgit/citation-parsers
commit sha a9eb7d4751f0c1fd4650e0ec6913b79e6f78e1d5
Initial commit
push time in 2 months
pull request comment ORCID/bibtexParseJs
Move test framework to devDependencies
Please merge this - the test framework shouldn't be part of the dependencies when installing this package.
comment created time in 2 months
created repository hubgit/citation-parsers
Comparison of citation parsing tools
created time in 2 months
started Pezmc/BibLaTeX-Linter
started time in 2 months
started FlamingTempura/bibtex-tidy
started time in 2 months
started plk/biber
started time in 2 months
started brefphp/bref
started time in 2 months
push event hubgit/browser-actions
commit sha 97eb2073f628252d2b652a3a7c412ef5e19ed919
Update "Download CRX" action
push time in 2 months
started vladignatyev/crx-extractor
started time in 2 months
PR opened dsifford/astrocite
Add ISBN, ISSN, PMCID and PMID to BibTeX -> CSL field mapping
[x] There is an associated issue.
[x] Code is up-to-date with the master branch
[x] You've successfully run npm test locally
[x] There are new or updated unit tests validating the change
refs #12
pr created time in 2 months
Pull request review comment dsifford/astrocite
export const FIELD_MAP: ReadonlyMap<string, keyof Data> = new Map([
    ['series', 'collection-number'],
    ['title', 'title'],
    ['volume', 'volume'],
+   ['url', 'URL'],
+   ['doi', 'DOI'],
It would be useful for someone to either look through the list of CSL fields and see which other fields might need to be added here (I can think of at least PMID and ISSN), or allow this Map to be extended by a parser option.
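One hypothetical shape for the second option, a field-map override passed as a parser option (this is not part of the current astrocite-bibtex API, just a sketch of what could be exposed):
const { parse } = require('astrocite-bibtex');

// Hypothetical "fieldMap" option: extra BibTeX -> CSL mappings merged into FIELD_MAP.
const extraFields = new Map([
    ['issn', 'ISSN'],
    ['pmid', 'PMID'],
]);

const items = parse(bibtexString, { fieldMap: extraFields }); // bibtexString: a BibTeX document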
comment created time in 2 months
issue comment dsifford/astrocite
Non-standard fields ignored by the BibTeX parser
https://www.acm.org/publications/authors/bibtex-formatting has an example BibTeX file containing these extra fields:
@article{Abril:2007:PHD:1188913.1188915,
author = {Abril, Patricia S. and Plant, Robert},
title = {The Patent Holder's Dilemma: Buy, Sell, or Troll?},
journal = {Commun. ACM},
issue_date = {January 2007},
volume = {50},
number = {1},
month = jan,
year = {2007},
issn = {0001-0782},
pages = {36--44},
numpages = {9},
url = {https://doi.org/10.1145/1188913.1188915},
doi = {10.1145/1188913.1188915},
acmid = {1188915},
publisher = {ACM},
address = {New York, NY, USA},
}
comment created time in 2 months
issue opened dsifford/astrocite
Non-standard fields ignored by the BibTeX parser
When parsing a BibTeX item, only the standard BibTeX fields (those listed in the FIELD_MAP Map) are preserved. This loses useful information such as the DOI, ISSN and URL.
Would it be possible to either add an option to include non-standard fields in the mapping or allow the FIELD_MAP to be extended as a parse option?
url and doi are both mentioned in the verbatimProperties array so are already parsed correctly, but are then dropped during the field mapping.
Code
const { parse } = require('astrocite-bibtex');
const data = `@article{Verkhratsky2019Evolution,
journal = {Advances in experimental medicine and biology},
doi = {10.1007/978-981-13-9913-8_2},
url = {https://link.springer.com/chapter/10.1007%2F978-981-13-9913-8_2},
issn = {0065-2598},
pmid = {31583583},
address = {United States},
title = {Evolution of Neuroglia},
volume = {1175},
author = {Verkhratsky, Alexei and Ho, Margaret S and Parpura, Vladimir},
pages = {15--44},
date = {2019},
year = {2019},
}`;
const item = parse(data);
console.log(item);
Expected behavior:
[
{
id: 'Verkhratsky2019Evolution',
type: 'article',
'container-title': 'Advances in experimental medicine and biology',
'publisher-place': 'United States',
title: 'Evolution of Neuroglia',
volume: '1175',
author: [ [Object], [Object], [Object] ],
page: '15–44',
issued: { 'date-parts': [Array] },
URL: 'https://link.springer.com/chapter/10.1007%2F978-981-13-9913-8_2',
DOI: '10.1007/978-981-13-9913-8_2',
ISSN: '0065-2598',
PMID: '31583583'
}
]
Actual behavior:
[
{
id: 'Verkhratsky2019Evolution',
type: 'article',
'container-title': 'Advances in experimental medicine and biology',
'publisher-place': 'United States',
title: 'Evolution of Neuroglia',
volume: '1175',
author: [ [Object], [Object], [Object] ],
page: '15–44',
issued: { 'date-parts': [Array] }
}
]
created time in 2 months
issue comment dsifford/astrocite
Error thrown when a newline is encountered within an RIS field
If you're able to find documentation negating that, I'd be happy to look into adjusting the parser.
I found this in the RIS specification:
How to handle long fields
If the information following any one tag is more than 70 characters long, it is allowable (though not necessary) to insert a carriage return/line feed at the end of 70 characters, and continue on the next line.
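Until the parser handles wrapped fields, a pre-processing filter along these lines could re-join continuation lines before calling parse (a sketch, not part of astrocite; it assumes any line that doesn't start with a two-character tag is a continuation of the previous field):
// Re-join RIS continuation lines onto the preceding tag line.
const unwrapRIS = text =>
    text
        .split(/\r?\n/)
        .reduce((lines, line) => {
            if (/^[A-Z][A-Z0-9]\s+-/.test(line) || lines.length === 0) {
                lines.push(line);
            } else {
                lines[lines.length - 1] += ' ' + line.trim();
            }
            return lines;
        }, [])
        .join('\n');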
comment created time in 2 months
issue comment dsifford/astrocite
Error thrown when a newline is encountered within an RIS field
Did this file get generated from some other application?
I'm testing with BibTeX files generated by https://lens.org but there are some other issues with abstracts (e.g. XML markup) in the files that suggest it's still a work-in-progress.
comment created time in 2 months
issue comment dsifford/astrocite
Error thrown when a newline is encountered within an RIS field
Feel free to close this if newlines in RIS fields aren't valid - it feels like that would be reasonable, as otherwise the contents of the text could accidentally start a new field.
comment created time in 2 months
issue opened dsifford/astrocite
Error thrown when a newline is encountered within an RIS field
I'm not 100% sure whether newlines are valid within RIS fields, and it's not too hard to run a filter to remove them before passing the data to astrocite if needed, but I ran into this issue and thought it might be something that should be handled by the parser.
Code
import { parse } from 'astrocite-ris'
const item = parse(`
TY - JOUR
AB - Brazilian Amazon forests contain a large stock of carbon that could be released into the atmosphere as a result of land use and cover change. To quantify the carbon stocks, Brazil has forest inventory plots from different sources, but they are unstandardized and not always available to the scientific community. Considering the Brazilian Amazon extension, the use of remote sensing, combined with forest inventory plots, is one of the best options to estimate forest aboveground biomass (AGB). Nevertheless, the combination of limited forest inventory data and different remote sensing products has resulted in significant differences in the spatial distribution of AGB estimates. This study evaluates the spatial coverage of AGB data (forest inventory plots, AGB maps and remote sensing products) in undisturbed forests in the Brazilian Amazon. Additionally, we analyze the interconnection between these data and AGB stakeholders producing the information. Specifically, we provide the first benchmark of the existing field plots in terms of their size, frequency, and spatial distribution.
We synthesized the coverage of forest inventory plots, AGB maps and airborne light detection and ranging (LiDAR) transects of the Brazilian Amazon. Although several extensive forest inventories have been implemented, these AGB data cover a small fraction of this region (e.g., central Amazon remains largely uncovered). Although the use of new technology such as airborne LiDAR cover a significant extension of AGB surveys, these data and forest plots represent only 1% of the entire forest area of the Brazilian Amazon.
Considering that several institutions involved in forest inventories of the Brazilian Amazon have different goals, protocols, and time frames for forest surveys, forest inventory data of the Brazilian Amazon remain unstandardized. Research funding agencies have a very important role in establishing a clear sharing policy to make data free and open as well as in harmonizing the collection procedure. Nevertheless, the use of old and new forest inventory plots combined with airborne LiDAR data and satellite images will likely reduce the uncertainty of the AGB distribution of the Brazilian Amazon.
AU - Tejada, Graciela
AU - Görgens, Eric Bastos
AU - Espírito-Santo, Fernando Del Bon
AU - Cantinho, Roberta Zecchini
AU - Ometto, Jean Pierre
CY - United Kingdom
DA - 2019/09/03
DO - 10.1186/s13021-019-0126-8
ID - 035-899-051-865-265
IS - 1
JF - Carbon balance and management
KW - Aboveground biomass
KW - Amazon
KW - Carbon
KW - REDD+
KW - Remote sensing
KW - Tropical rain forest
PB - BioMed Central
PY - 2019
SN - 17500680
SP - 11
TI - Evaluating spatial coverage of data on the aboveground biomass in undisturbed forests in the Brazilian Amazon.
UR - https://lens.org/035-899-051-865-265
VL - 14
ER -
`)
Expected behavior:
A parsed item, with an abstract containing newlines.
Actual behavior:
SyntaxError: Expected "ER", "\n", "\r\n", or [A-Z0-9] but "t" found.
created time in 2 months
started wasdk/WebAssemblyStudio
started time in 2 months
issue comment hubgit/react-prosemirror
[Question] Consider configurable debounce time?
Actually I just noticed that only HtmlEditor uses the debounce, so maybe if you used Editor directly and called parse and serialize in your code that would solve the problem?
comment created time in 2 months
started Elaniobro/slack-emojis
started time in 2 months
push event hubgit/zipadee
commit sha f0ebfd4cfa55d2d14f5f6e969d63542fd46d7c20
Update .nowignore
push time in 2 months
push event hubgit/zipadee
commit sha e5049532a6022fa80ef155bff8d4f996ee5187f6
Update App.css
push time in 2 months
push event hubgit/zipadee
commit sha 0b1cca8c97a531969e3366dae13133c35cd60014
Update App.css
push time in 2 months
started SonarSource/eslint-plugin-sonarjs
started time in 2 months
PR opened octet-stream/form-data
The value set for __defaultContentType is meant to be application/octet-stream.
pr created time in 2 months
push event hubgit/form-data
commit sha 51e01a7175f5b558f47ea84574728d69c31d6d90
Fix typo in __defaultContentType value
push time in 2 months
fork hubgit/form-data
FormData implementation for Node.js. Built over Readable stream and async generators.
https://www.npmjs.com/package/formdata-node
fork in 2 months
started superseb/cert-check
started time in 2 months
PR merged JATS4R/jats-validator
Bumps @jats4r/dtds from 0.0.4 to 0.0.5.
Commits:
189d89c 0.0.5
1ed0adf Remove the doctype from the catalog file
See full diff in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
Dependabot commands and options:
You can trigger Dependabot actions by commenting on this PR:
@dependabot rebase will rebase this PR
@dependabot recreate will recreate this PR, overwriting any edits that have been made to it
@dependabot merge will merge this PR after your CI passes on it
@dependabot squash and merge will squash and merge this PR after your CI passes on it
@dependabot cancel merge will cancel a previously requested merge and block automerging
@dependabot reopen will reopen this PR if it is closed
@dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
@dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
@dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
@dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
@dependabot use these labels will set the current labels as the default for future PRs for this repo and language
@dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
@dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
@dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language
@dependabot badge me will comment on this PR with code to add a "Dependabot enabled" badge to your readme
Additionally, you can set the following in your Dependabot dashboard:
- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
Finally, you can contact us by mentioning @dependabot.
pr closed time in 2 months
push event JATS4R/jats-validator
commit sha f3448438db36d34a66354380927d97792c321671
Bump @jats4r/dtds from 0.0.4 to 0.0.5 (#23) Bumps [@jats4r/dtds](https://github.com/JATS4R/jats-dtds) from 0.0.4 to 0.0.5. - [Release notes](https://github.com/JATS4R/jats-dtds/releases) - [Commits](https://github.com/JATS4R/jats-dtds/compare/v0.0.4...v0.0.5) Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
push time in 2 months