If you are wondering where the data on this site comes from, please visit https://api.github.com/users/vict0rsch/events. GitMemory does not store any data; it only uses NGINX to cache data for a period of time. The idea behind GitMemory is simply to give users a better reading experience.
Victor Schmidt · vict0rsch · @mila-iqia · Montréal, Canada · https://vict0rs.ch · PhD Student @Mila-iqia with Yoshua Bengio | ex @entrepreneur-interet-general

vict0rsch/deep_learning 418

Deep Learning Resources and Tutorials using Keras and Lasagne

uclnlp/jack 246

Jack the Reader

mlco2/impact 59

ML has an impact on the climate, but not all models are born equal. Compute your model's emissions with our calculator and add the results to your paper with our generated LaTeX template.

datafornews/metada 34

A browser extension to show you who owns the media you read

mila-iqia/COVI-AgentSim 12

Covid-19 spread simulator with human mobility and intervention modeling.

vict0rsch/ArxivTools 12

A browser extension that enhances Arxiv: BibTex citation, Markdown link, direct download and more!

mila-iqia/COVI-ML 8

Risk model training code for Covid-19 tracing application.

pg2455/covid_p2p_simulation 6

Simulator for COVID-19 spread

vict0rsch/data_science_polytechnique 5

Data Science introduction from the Ecole polytechnique

push event vict0rsch/ArxivTools

vict0rsch

commit sha cd3a900af7a21450d1d5ac3e9b24765c84af86c0

refactor

vict0rsch

commit sha 9fe0194230321d922e711acbdfcb34966520038b

refactor -> minify all

push time in 9 hours

push event vict0rsch/ArxivTools

vict0rsch

commit sha 4ecbf9def0a56260ace7b540bf83bad633ed84a7

fix ids

push time in 9 hours

push event vict0rsch/ArxivTools

vict0rsch

commit sha cc62a549d33f390ac93fd61d9e6d44ff00a49241

fix bugs

push time in 18 hours

push event mlco2/codecarbon

vict0rsch

commit sha 25704dd189c247f6e899a00d3e8cdfed971e5314

move gpu logs down

push time in 2 days

push event mlco2/codecarbon

vict0rsch

commit sha b38cfdfc7817bf4003951bd9383f7407077296aa

add ram unit

push time in 2 days

push event mlco2/codecarbon

vict0rsch

commit sha 9c9efdb600c0181ca608abb837b19ee489e7402e

improve metadata logs

push time in 2 days

push event mlco2/codecarbon

vict0rsch

commit sha b4476802b7b2048ae9803ebe05d336e4aeadc44d

add ram metadata

vict0rsch

commit sha 103f728469a0e528016f9e90e214b987cbd7e494

improve ram detection

vict0rsch

commit sha 6611c438a58d903dab75a05d95b02d2ac4593ab9

add count_cpus() function

vict0rsch

commit sha f3bbb6c55182e40e6d4bda54a3e3982d1580f4a9

refactor logs

push time in 2 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha 7808de4352f855c93880d4dba8f7e79b4c453da2

fix title update

push time in 2 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha f208c7ce0a5d9cf068deb9b049b8f70fe2ac5183

fix focus and id management

push time in 2 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha f3cdcb9cfe6d41586ed0a42696265c0c33ec84e0

add tabs and improve keyboard nav

push time in 2 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha 39b6e0011e3175b3ee1577434a31ff9e1eee4318

tiny fix

push time in 3 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha a5937323d2f271a6f3f56b6542e378817a1981ab

clean up makeMemoryTable

push time in 3 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha 49d491d2a79dc91d54f5fae5ef6aa2b598a14ad1

update readme

push time in 3 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha a47834dbd702669458b447efb76443f6e2a46c78

handle multiple words

vict0rsch

commit sha 8d2b86de194c5f850e2ecf8f2f74878399a83dd8

update readme

push time in 3 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha 0eaa7fdb76efc13a386312a09e68bb3c4e69e417

edit tagline

push time in 3 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha baebaaeb431c06f09ebed945db016fdbc1ad272a

add keyboard navigation (tab, backspace, enter, shortcut)

push time in 3 days

PullRequestReviewEvent

push event vict0rsch/ArxivTools

vict0rsch

commit sha f7500890c8124dcd91500f8c235cfc7c598f2469

add notes and new sprite and download

push time in 4 days

IssuesEvent

issue closed mlco2/codecarbon

Issue with RAPL energy on Linux - value seems wrong by a 100x factor

  • CodeCarbon version: codecarbon==1.0.0
  • Python version: 3.8.3
  • Operating System: Linux Ubuntu 20.04 LTS

Description

I've been measuring the energy consumption of a CPU-only inference run (7 min long) using several tools:

  • powerAPI / smartwatt: powerapi.org/
  • pyJoules: https://pyjoules.readthedocs.io/en/latest/index.html
  • codecarbon

CodeCarbon gives wildly different results: 2.15 Wh for powerAPI, 2.21 Wh for pyJoules, and 0.035 Wh for CodeCarbon. Given the duration of the test and the processor in my test computer, I believe powerAPI and pyJoules are right and CodeCarbon has an issue. I had a quick look at how CodeCarbon measures energy on Linux, in IntelRAPL.get_cpu_details: if I understand correctly, you calculate the CPU's power by reading the energy consumed during 10 milliseconds (and then dividing by 10 ms to get an average power over that window), and then use that value to compute the energy for the whole tracker interval (15 s by default). This seems to underestimate the total energy usage: there is no reason to assume the average power over those 10 ms equals the average power over the whole interval. Besides, since the RAPL counter simply counts joules, it would be easier and more accurate to read the counter at the beginning and end of the interval to measure the energy used by the CPU during that interval.

Is there a reason for this approach, or have I misunderstood the implementation?

I have not tried CodeCarbon on Windows, so I cannot say whether the reported value has the same issue.
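The counter-based measurement the issue proposes can be sketched as follows. This is an illustrative snippet, not CodeCarbon's actual implementation: the sysfs paths come from the standard Linux powercap interface, and the helper names (`read_uj`, `interval_energy_wh`, `measure_interval`) are hypothetical.

```python
# Sketch of the proposed strategy: read the cumulative RAPL energy
# counter at the start and end of the tracker interval, instead of
# extrapolating a 10 ms power sample over the whole interval.
import time

# Standard Linux powercap sysfs files for the first RAPL package domain.
RAPL_ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"
RAPL_MAX_FILE = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"


def read_uj(path: str) -> int:
    """Read a cumulative RAPL counter value, in microjoules."""
    with open(path) as f:
        return int(f.read().strip())


def interval_energy_wh(start_uj: int, end_uj: int, max_uj: int) -> float:
    """Energy consumed between two counter reads, in watt-hours.

    The counter wraps around at max_energy_range_uj, so an end value
    smaller than the start value means one wrap occurred.
    """
    if end_uj >= start_uj:
        delta_uj = end_uj - start_uj
    else:
        delta_uj = (max_uj - start_uj) + end_uj
    return delta_uj / 1e6 / 3600  # microjoules -> joules -> watt-hours


def measure_interval(interval_s: float = 15.0) -> float:
    """Measure CPU package energy over one tracker interval, in Wh."""
    max_uj = read_uj(RAPL_MAX_FILE)
    start = read_uj(RAPL_ENERGY_FILE)
    time.sleep(interval_s)
    end = read_uj(RAPL_ENERGY_FILE)
    return interval_energy_wh(start, end, max_uj)
```

With this approach the reported energy is exact over the interval (up to counter resolution and wrap frequency), rather than resting on the assumption that a 10 ms power average holds for the full 15 s.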

closed time in 4 days

PierreRust

fork vict0rsch/meshgraphnets

Rewrite deepmind/meshgraphnets into pytorch

fork in 4 days

push event mlco2/codecarbon

vict0rsch

commit sha debd82c11ac44605e27483a7b348331c28fd25f4

fix mock args order in tests

push time in 4 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha d168722a7d40a02de5a6b9ebb74d5a839c3b0380

improve UI

push time in 6 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha aa1ca7a6b7b86a8e347e1de03aba7dce35dae9ec

functional memory -- v1

push time in 7 days

push event vict0rsch/ArxivTools

vict0rsch

commit sha 9ad8e5f36bdcdd47f6189fb8cd5b1dff999763be

add memory -- prototype

push time in 7 days

PullRequestReviewEvent

issue comment wwMark/meshgraphnets

Requirements

Also, do you plan on completely removing the tensorflow dependency?

vict0rsch

comment created time in 9 days

issue comment wwMark/meshgraphnets

Requirements

Ok great to know thanks a lot :)

(may I just suggest you update requirements.txt?)

vict0rsch

comment created time in 9 days

issue opened wwMark/meshgraphnets

Requirements

Hi @wwMark,

I was looking for a Pytorch implementation of the MeshGraphNets and I stumbled upon your repo.

Judging by the commits and the state of requirements.txt, I expect this is very much a work in progress?

Do you already know what is functional or not (if it's not all functional)?

Thanks for sharing this work,

Victor

created time in 9 days

started wwMark/meshgraphnets

started time in 9 days