
When storing remotely, print full url in log msgs #35

Open
patcon opened this issue Nov 6, 2015 · 6 comments

patcon commented Nov 6, 2015

This is what we currently see in the logs:

[2015/11/06 21:02:52][info] Storage::S3 (manual) Started...
[2015/11/06 21:02:52][info] Storing 'mybucket/mypath/my-model/2015.11.06.21.02.51/my-model.tar'...
[2015/11/06 21:02:52][info] Storage::S3 (manual) Finished!
[2015/11/06 21:02:52][info] Storage::Local (manual) Started...
[2015/11/06 21:02:52][info] Storing '/var/mybackups/my-model/2015.11.06.21.02.51/my-model.tar'

Would be nice if the full URL were in the output for remote S3 storage. A coworker ran the backup command I had documented, but didn't know what the URL of the stored file was (https://s3.amazonaws.com/mybucket/mypath/my-model/2015.11.06.21.02.51/my-model.tar), and had to seek guidance.

Thanks!

@tombruijn (Member)

I think the problem is that by logging the full URL we imply that you can load it. However, not all S3 files are public, and I wouldn't recommend making your backups public. Currently the log line only acts as a helper to show the location inside the bucket.

Even so, it's quite easy to add the S3 URL to the log. I'm just not sure whether we should want to:

diff --git a/lib/backup/storage/s3.rb b/lib/backup/storage/s3.rb
index e71697b..12f92d1 100644
--- a/lib/backup/storage/s3.rb
+++ b/lib/backup/storage/s3.rb
@@ -104,7 +104,7 @@ module Backup
         package.filenames.each do |filename|
           src = File.join(Config.tmp_path, filename)
           dest = File.join(remote_path, filename)
-          Logger.info "Storing '#{ bucket }/#{ dest }'..."
+          Logger.info "Storing 'https://s3.amazonaws.com/#{ bucket }/#{ dest }'..."
           cloud_io.upload(src, dest)
         end
       end


patcon commented Nov 7, 2015

Hm. Good points. Ours aren't publicly available, but the other tool I use lets the URL from an S3 item's "Link" property be used to restore -- the library handles the authentication and download.

So not having it just adds the small first step of finding the full URL, for the uninitiated. Anyhow, I thought it might be helpful and of little harm, but I'm also a-ok with closing this if you don't feel the same way :)


patcon commented Nov 7, 2015

But if we were to add this, we should probably use the actual s3_origin, since I think that can be changed via fog_options if someone is using an S3-compatible self-hosted service.
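
Something like this in the Backup model, maybe (hypothetical values, and I'm not certain these are exactly the keys fog/fog-aws accepts for an endpoint override, so treat it as a sketch):

Model.new(:my_model, 'My model backup') do
  store_with S3 do |s3|
    s3.access_key_id     = 'KEY'
    s3.secret_access_key = 'SECRET'
    s3.bucket            = 'mybucket'
    s3.path              = 'mypath'
    # Point fog at a self-hosted S3-compatible endpoint instead of AWS
    # (assumed option keys; check what fog-aws actually accepts).
    s3.fog_options = { host: 's3.example.internal', scheme: 'https', port: 9000 }
  end
end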

@tombruijn (Member)

I don't know; I can't add it that easily. What are the options one would set? The fog-aws docs tell me nothing.


patcon commented Nov 11, 2015

Seems it's stored in the connection object as @host, @scheme and @port, so I'd need to give cloud_io/s3 a method to pull this out via #instance_variable_get or something?
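
Roughly like this, maybe (untested sketch; the @scheme/@host/@port names and the memoized connection helper are assumptions about fog's and cloud_io/s3's internals):

# Hypothetical helper inside Backup::CloudIO::S3 that digs the endpoint
# out of the fog connection's instance variables. Fragile, since these
# are fog internals rather than public API.
def endpoint_url
  conn   = connection  # assumed memoized fog Storage connection
  scheme = conn.instance_variable_get(:@scheme) || "https"
  host   = conn.instance_variable_get(:@host)   || "s3.amazonaws.com"
  port   = conn.instance_variable_get(:@port)
  url    = "#{ scheme }://#{ host }"
  url << ":#{ port }" if port && ![80, 443].include?(port.to_i)
  url
end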

@tombruijn (Member)

I tried to fetch the data from the fog connection, since cloud_io/s3 doesn't know anything about this specifically. I'd have to dig too deep into fog's private variables and methods to get anything useful, so I can't recommend it.

My best guess would then be to fetch it from the user-set fog options and fall back to defaults:

url = "#{fog_options[:scheme] || "https"}://#{fog_options[:host] || "s3.amazonaws.com"}:#{fog_options[:port] || "80"}/#{ bucket }/#{ dest }"

(untested)
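
Spelled out a bit more, still untested, and skipping the port when it's the default for the scheme (the :80 fallback above wouldn't be right with https):

# Build the URL from user-supplied fog options with fallbacks.
scheme = fog_options[:scheme] || "https"
host   = fog_options[:host]   || "s3.amazonaws.com"
port   = fog_options[:port]
url    = "#{ scheme }://#{ host }"
url << ":#{ port }" if port && ![80, 443].include?(port.to_i)
Logger.info "Storing '#{ url }/#{ bucket }/#{ dest }'..."

(bucket and dest being the same locals already used in the loop shown in the diff above.)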
