If you are wondering where the data on this site comes from, please visit https://api.github.com/users/raajheshkannaa/events. GitMemory does not store any data; it only uses NGINX to cache data for a period of time. The idea behind GitMemory is simply to give users a better reading experience.
Raajhesh Kannaa Chidambaram raajheshkannaa

raajheshkannaa/amazon-sagemaker-examples 0

Example notebooks that show how to apply machine learning, deep learning and reinforcement learning in Amazon SageMaker

raajheshkannaa/awesome-appsec 0

A curated list of resources for learning about application security

raajheshkannaa/awesome-cheatsheet 0

:beers: awesome cheatsheet

raajheshkannaa/awesome-pentest 0

A collection of awesome penetration testing resources, tools and other shiny things

raajheshkannaa/awesome-web-hacking 0

A list of web application security

raajheshkannaa/aws-secure-access-infrastructure-identity-workshop 0

This workshop is designed to help you get familiar with AWS Security services and learn how to use them to securely administer systems in your environment. You will be working with services such as AWS Systems Manager Session Manager, Amazon EC2 Instance Connect, and AWS Identity and Access Management. You will learn how to use these services to securely connect and administer your Amazon EC2 Instances as well as systems on-premise, you will setup tagged based access, and configure logging which will enable auditing of administrative activities and improve the security posture of your environment.

raajheshkannaa/glacierBackup 0

A simple script to backup a folder (or folders) to Amazon Glacier

raajheshkannaa/linux-insides 0

A little bit about a linux kernel

raajheshkannaa/og-aws 0

📙 Amazon Web Services — a practical guide

raajheshkannaa/Potato 0

If you fork this then you can say you forked a potato.

started ageron/handson-ml2

started time in 15 days

started josephofiowa/missing-data-workshop-odsc19

started time in 15 days

started amueller/COMS4995-s20

started time in 15 days

issue comment cobbzilla/s3s3mirror

account to account copy

I am also wondering whether this tool supports copying between two S3-compatible APIs, where one of them is outside AWS, e.g. Backblaze.
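(Editorial sketch, not from the thread: whether s3s3mirror itself accepts custom endpoints is exactly what the commenter is asking, but the usual way an S3-compatible, non-AWS service is addressed in Python tooling is via boto3's `endpoint_url` client parameter. The helper below only builds the client keyword arguments; the Backblaze endpoint URL shown is an illustrative assumption, not a verified value.)

```python
def s3_client_kwargs(endpoint_url=None, region=None):
    """Build keyword arguments for boto3.client('s3', ...).

    Non-AWS providers that speak the S3 wire protocol (e.g. Backblaze B2)
    are reached by overriding the endpoint; AWS-side clients typically
    only need a region.
    """
    kwargs = {}
    if endpoint_url:
        kwargs["endpoint_url"] = endpoint_url
    if region:
        kwargs["region_name"] = region
    return kwargs


# One client per side of the copy, each with its own endpoint/region:
aws_side = s3_client_kwargs(region="us-east-1")
b2_side = s3_client_kwargs(endpoint_url="https://s3.us-west-002.backblazeb2.com")
```

A tool copying between the two would then stream objects through the machine it runs on, since server-side copy only works within one provider.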

krishnachaitanya-olx

comment created time in 22 days

release ytdl-org/youtube-dl

2021.04.17

released time in 22 days

started antoniomika/sish

started time in a month

created repository mubix/securitytitles.com

Standardizing Security Titles

created time in a month

started liamg/traitor

started time in 2 months

created repository mubix/mubix

created time in 2 months

started optiv/ScareCrow

started time in 2 months

started FortyNorthSecurity/EXCELntDonut

started time in 2 months

started BishopFox/sliver

started time in 2 months

started nettitude/PoshC2

started time in 2 months

started fox-it/BloodHound.py

started time in 2 months

fork mubix/scoringengine

Scoring Engine for Red/White/Blue Team Competitions

fork in 3 months

issue opened cobbzilla/s3s3mirror

account to account copy

Hello,

What is the syntax to copy objects between different accounts and different regions?

Thanks, Krishna

created time in 3 months

created repository mubix/manage2decrypt

ManageEngine OpManager Decryption Tools

created time in 3 months

issue comment cobbzilla/s3s3mirror

Cannot copy from "subfolder" (prefix) to "root-folder" (without prefix) from src-bucket to dest-bucket

Hi @cobbzilla ,

Yes, I tried that. I used a fresh git clone of the repo; please find the commands and test results below. I might be doing something wrong - but I do not know what :)

(Side note: it seems log4j.xml is missing from "/target/classes/log4j.xml" in the repo - might be my miss as well, though.)

Thanks for this cool and very helpful tool! And thanks for helping to support it!

Thanks and cheers - Robert

[12:15 wunsch@wunsch-WX-3 s3s3mirror] (master) > aws s3 ls s3://mdb-test-source --recursive
2021-02-09 10:21:22 1073741824 datastore/test123

[12:15 wunsch@wunsch-WX-3 s3s3mirror] (master) > aws s3 ls s3://mdb-test-target --recursive

[12:16 wunsch@wunsch-WX-3 s3s3mirror] (master) > ./s3s3mirror.sh -v s3://mdb-test-source/datastore s3://mdb-test-target
log4j:ERROR Could not parse url [file:target/classes/log4j.xml].
java.io.FileNotFoundException: target/classes/log4j.xml (No such file or directory)
        at java.io.FileInputStream.open0(Native Method)
        at java.io.FileInputStream.open(FileInputStream.java:195)
        at java.io.FileInputStream.<init>(FileInputStream.java:138)
        at java.io.FileInputStream.<init>(FileInputStream.java:93)
        at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
        at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
        at org.apache.log4j.xml.DOMConfigurator$2.parse(DOMConfigurator.java:765)
        at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:871)
        at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:778)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
        at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
        at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
        at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
        at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
        at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:417)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:362)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:388)
        at org.cobbzilla.s3s3mirror.MirrorMain.<clinit>(MirrorMain.java:23)
log4j:WARN No appenders could be found for logger (com.amazonaws.AmazonWebServiceClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

[12:18 wunsch@wunsch-WX-3 s3s3mirror] (master) > aws s3 ls s3://mdb-test-target --recursive
2021-02-10 12:17:40 1073741824 datastore/test123
rwunsch

comment created time in 3 months

PR opened cobbzilla/s3s3mirror

When Using Prefixes With Deletion, Objects Are Deleted When They Shouldn't Be

As I have started to use additional features of the library, I noticed that if I include prefixes along with --delete-removed, objects would be deleted before they were copied over again. This of course defeats much of the purpose of doing the sync, and also inflates the size of the bucket when versioning is enabled. Digging in, it turned out to be a few small issues, which this PR fixes.
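(Editorial sketch, not the actual s3s3mirror code: the ordering bug described above can be shown with a toy mirror over in-memory dicts. The point is that with a --delete-removed-style option, deletions must run after the copy phase; otherwise objects still present in the source get deleted and then re-copied, churning versioned buckets. `mirror` and its signature are hypothetical.)

```python
def mirror(source: dict, dest: dict, delete_removed: bool = False) -> dict:
    """Mirror source -> dest, with buckets modeled as key -> data dicts."""
    # Phase 1: copy new or changed objects from source to dest.
    for key, data in source.items():
        if dest.get(key) != data:
            dest[key] = data
    # Phase 2: only AFTER copying, remove keys that are absent from the
    # source. Running this phase first is the bug the PR describes.
    if delete_removed:
        for key in list(dest):
            if key not in source:
                del dest[key]
    return dest


# "c" is removed, "b" is updated, "a" is copied -- nothing is deleted
# that still exists in the source:
mirror({"a": 1, "b": 2}, {"b": 9, "c": 3}, delete_removed=True)
```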

+10 -6

0 comments

5 changed files

pr created time in 3 months

issue comment cobbzilla/s3s3mirror

Cannot copy from "subfolder" (prefix) to "root-folder" (without prefix) from src-bucket to dest-bucket

Did you try: s3s3mirror src-bucket/datastore target-bucket ?

rwunsch

comment created time in 3 months

issue opened cobbzilla/s3s3mirror

Cannot copy from "subfolder" (prefix) to "root-folder" (without prefix) from src-bucket to dest-bucket

We are trying to copy a src-s3-bucket with a "prefix" (subfolder) to a dest-s3-bucket without a "prefix" (subfolder).

Example: in the src-bucket we have a subfolder "datastore": s3://src-bucket/datastore. We would like to "copy" everything from "s3://src-bucket/datastore" into the root of "s3://target-bucket".

Options we tried:

s3s3mirror -p "datastore" s3://src-bucket s3://target-bucket -> copies "datastore" folder
s3s3mirror -p "datastore" -d s3://src-bucket s3://target-bucket -> fails with error
s3s3mirror -p "datastore" -d "" s3://src-bucket s3://target-bucket -> copies "datastore" folder
s3s3mirror -p "datastore" -d " " s3://src-bucket s3://target-bucket -> creates " " folder
s3s3mirror -p "datastore" -d "." s3://src-bucket s3://target-bucket -> creates "." folder
s3s3mirror -p "datastore" -d . s3://src-bucket s3://target-bucket -> creates "." folder
s3s3mirror -p "datastore" -d / s3://src-bucket s3://target-bucket -> copies into "/" folder

How would one specify "using the s3-root" in the "-d" parameter?
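(Editorial sketch, not s3s3mirror's actual implementation: what the asker wants amounts to rewriting each source key by stripping the source prefix and prepending an empty destination prefix. The helper below illustrates that key mapping; `dest_key` and its signature are hypothetical.)

```python
def dest_key(src_key: str, src_prefix: str, dest_prefix: str = "") -> str:
    """Map a source object key to its destination key.

    Strip the source prefix from the key, then prepend the destination
    prefix. An empty destination prefix means the bucket root.
    """
    src_prefix = src_prefix.rstrip("/")
    if src_prefix and src_key.startswith(src_prefix + "/"):
        src_key = src_key[len(src_prefix) + 1:]
    if dest_prefix:
        return dest_prefix.rstrip("/") + "/" + src_key
    return src_key


# Copying "datastore/test123" to the bucket root drops the prefix:
dest_key("datastore/test123", "datastore")
```

Under this mapping, an empty destination prefix is the "s3-root" the asker is looking for; the behaviors observed with " ", ".", and "/" arise because those strings are used literally as folder names rather than being treated as empty.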

created time in 3 months