
Unable to get S3FileAdapter to work on latest version of parse-server #1483

Closed · drnelson opened this issue Apr 14, 2016 · 11 comments

@drnelson

I currently have a working parse-server deployment; however, every time I upload a file it is stored with the GridStore file adapter instead of the S3 one. My app JS looks like this:

// Example express application adding the parse-server module to expose Parse
// compatible API routes.

var express = require('express');
var ParseServer = require('parse-server').ParseServer;
var path = require('path');

var S3Adapter = require('parse-server').S3Adapter;

var databaseUri = process.env.DATABASE_URI || process.env.MONGOLAB_URI;

if (!databaseUri) {
  console.log('DATABASE_URI not specified, falling back to localhost.');
}

var api = new ParseServer({
  databaseURI: databaseUri || 'mongodb://localhost:27017/dev',
  cloud: process.env.CLOUD_CODE_MAIN || __dirname + '/cloud/main.js',
  appId: process.env.APP_ID || 'somAppId',
  masterKey: process.env.MASTER_KEY || '', //Add your master key here. Keep it secret!
  serverURL: process.env.SERVER_URL || 'http://localhost:1337/parse',  // Don't forget to change to https if needed
  liveQuery: {
    classNames: ["Posts", "Comments"] // List of classes to support for query subscriptions
  },
  filesAdapter: new S3Adapter(
    "S31",
    "S32",
    "bucket-name",
    {directAccess: true}
  )
});
// Client keys like the JavaScript key or the .NET key are not necessary with parse-server
// If you wish to require them, you can set them as options in the initialization above:
// javascriptKey, restAPIKey, dotNetKey, clientKey

var app = express();

// Serve static assets from the /public folder
app.use('/public', express.static(path.join(__dirname, '/public')));

// Serve the Parse API on the /parse URL prefix
var mountPath = process.env.PARSE_MOUNT || '/parse';
app.use(mountPath, api);

// Parse Server plays nicely with the rest of your web routes
app.get('/', function(req, res) {
  res.status(200).send('Make sure to star the parse-server repo on GitHub!');
});

// There will be a test page available on the /test path of your server url
// Remove this before launching your app
app.get('/test', function(req, res) {
  res.sendFile(path.join(__dirname, '/public/test.html'));
});

var port = process.env.PORT || 1337;
var httpServer = require('http').createServer(app);
httpServer.listen(port, function() {
    console.log('parse-server-example running on port ' + port + '.');
});

// This will enable the Live Query real-time server
ParseServer.createLiveQueryServer(httpServer);

"S31" and "S32" above are placeholders for the access key and secret of the bucket. The bucket is working and I have tested the security profile as well. I'm not getting any errors in the logs either.

After I deploy I do not get any errors; everything seems to be running fine. However, when I upload a file it goes to my MongoLab DB instead of my S3 storage. Do I need to do something else to configure this? I'm using the docs as a reference:

https://github.com/ParsePlatform/parse-server/wiki/Configuring-File-Adapters

@deszip

deszip commented Apr 14, 2016

As stated here:
http://blog.parse.com/announcements/hosting-files-on-parse-server/
you have to provide a fileKey in the parse-server init.
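
A minimal sketch of what that looks like, assuming the legacy file key is supplied through a hypothetical FILE_KEY environment variable (fileKey only matters for files that were originally hosted on parse.com):

    // Sketch only: add fileKey alongside the existing options in index.js.
    // FILE_KEY is a hypothetical environment variable name.
    var api = new ParseServer({
      databaseURI: databaseUri || 'mongodb://localhost:27017/dev',
      appId: process.env.APP_ID || 'somAppId',
      masterKey: process.env.MASTER_KEY || '',
      serverURL: process.env.SERVER_URL || 'http://localhost:1337/parse',
      fileKey: process.env.FILE_KEY || '', // legacy parse.com file key
      filesAdapter: new S3Adapter(
        "S31",
        "S32",
        "bucket-name",
        {directAccess: true}
      )
    });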

@xor22h

xor22h commented Apr 26, 2016

Another issue could be the region; S3 buckets in Frankfurt (eu-central-1) support only v4 signatures:

Error from S3Adapter: "The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256."
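
As a sketch of how that could be addressed (option names assumed from the S3 adapter's options rather than verified here), the region can be passed through the adapter options so the underlying AWS SDK signs requests for the right endpoint, e.g. eu-central-1 for Frankfurt:

    // Sketch: pass a region in the S3 adapter options; Frankfurt (eu-central-1)
    // only accepts AWS4-HMAC-SHA256 (v4) signatures.
    filesAdapter: new S3Adapter(
      process.env.S3_ACCESS_KEY,
      process.env.S3_SECRET_KEY,
      process.env.S3_BUCKET,
      { region: 'eu-central-1', directAccess: true }
    )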

@5amfung

5amfung commented Apr 26, 2016

I have the same issue. I'm using the Standard region, which I believe is us-east-1 by default and should work. Also, I'm not migrating any files; it's a new setup. All my files are still written to Mongo for some reason. Here's my config:

    filesAdapter: {
        module: "parse-server-s3-adapter",
        option: {
            accessKey: process.env.S3_ACCESS_KEY || '',
            secretKey: process.env.S3_SECRET_KEY || '',
            bucket: process.env.S3_BUCKET || '',
            directAccess: true
        }
    },
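
One thing worth double-checking in that snippet: when the adapter is declared as an object with a module name, the nested key is expected to be options (plural), so the block above may simply be ignored and parse-server falls back to GridStore. A corrected sketch, keeping the same environment variables:

    filesAdapter: {
        module: "parse-server-s3-adapter",
        options: {
            accessKey: process.env.S3_ACCESS_KEY || '',
            secretKey: process.env.S3_SECRET_KEY || '',
            bucket: process.env.S3_BUCKET || '',
            directAccess: true
        }
    },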

@5amfung

5amfung commented Apr 26, 2016

My previous configuration didn't work, but the following did:

filesAdapter: new S3Adapter({
        accessKey: process.env.S3_ACCESS_KEY || '',
        secretKey: process.env.S3_SECRET_KEY || '',
        bucket: process.env.S3_BUCKET || ''
    }),
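
For completeness, a sketch of the surrounding lines, assuming the constructor comes from the standalone parse-server-s3-adapter package (the bundled require('parse-server').S3Adapter used in the original post also exposes a constructor):

    // Sketch: requiring the standalone adapter package (an assumption).
    var S3Adapter = require('parse-server-s3-adapter');

    var api = new ParseServer({
      // ...other options as in the original index.js...
      filesAdapter: new S3Adapter({
        accessKey: process.env.S3_ACCESS_KEY || '',
        secretKey: process.env.S3_SECRET_KEY || '',
        bucket: process.env.S3_BUCKET || ''
      })
    });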

@tingham

tingham commented May 18, 2016

I'm doing a manual migration of images and other file data from Parse to S3 storage. To do this I built a simple script that connects to the existing Parse instance and fetches my records, then pushes the file data up to the S3 bucket.

I then changed the configuration of my parse-server instance to use the S3Adapter. Currently, when I query a document out of my self-hosted parse-server, I would expect to see my Amazon URL in the file URL. What I'm seeing instead is http://files.parsetfss.com/invalid-file-key/tfss-###-file

I've matched the configuration structure indicated by @5amfung and tried implementations that specify directAccess: true, with and without a fileKey, with the same result.

I also added some debug output to getFileLocation in the S3 adapter code and I never see any output. It's as if parse-server isn't using the adapter when I'm fetching files.

EDIT

It looks like if I update the stored filename in the field after submission to S3 to something that doesn't start with tfss, then the S3 adapter "kicks in." Weird.
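
That matches how file URLs are resolved: names created on parse.com keep a tfss- prefix, and such legacy names are served from files.parsetfss.com using the configured fileKey instead of asking the files adapter for a location. A rough illustration of that branching (not the actual parse-server source):

    // Illustrative sketch only, not the real implementation.
    function getFileUrl(config, filename) {
      if (filename.indexOf('tfss-') === 0) {
        // Legacy parse.com file: served from the hosted files domain, which is
        // why a missing fileKey shows up as "invalid-file-key" in the URL.
        return 'http://files.parsetfss.com/' + config.fileKey + '/' + encodeURIComponent(filename);
      }
      // Anything else goes through the configured adapter, e.g. S3Adapter.
      return config.filesController.adapter.getFileLocation(config, filename);
    }

Renaming the stored file so it no longer carries the tfss- prefix therefore pushes it down the adapter path, which is the behaviour observed above.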

@kvreem

kvreem commented Jun 1, 2016

@tingham I have an issue, referenced in #1582, where all the new post-migration files are inaccessible to users who do not update to the migrated version. Do you have a workaround for this issue?

@tingham

tingham commented Jun 1, 2016

@kkhattab I've written a script that can perform a one-time manual migration of files from parse.com to (in my case) Heroku + S3. By specifying multiple DSNs I've been able to confirm that the process works, but given the risk inherent in running it on live data I'm hesitant to publish it.
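
For anyone attempting the same kind of migration, the rough shape of such a script is sketched below. This is only a sketch under several assumptions: the legacy files are still downloadable at their parse.com URLs, AWS credentials are picked up from the environment by the aws-sdk, and the class and field names (Photo, image) are hypothetical placeholders.

// Sketch of a one-time file migration: read File URLs from the old Parse app,
// download each file, and re-upload it to S3 under the same name.
var https = require('https');
var AWS = require('aws-sdk');
var Parse = require('parse/node');

Parse.initialize('OLD_APP_ID', null, 'OLD_MASTER_KEY'); // hypothetical credentials
Parse.serverURL = 'https://api.parse.com/1';

var s3 = new AWS.S3();              // credentials taken from the environment
var BUCKET = process.env.S3_BUCKET; // target bucket name

function download(url) {
  return new Promise(function(resolve, reject) {
    https.get(url, function(res) {
      var chunks = [];
      res.on('data', function(c) { chunks.push(c); });
      res.on('end', function() { resolve(Buffer.concat(chunks)); });
    }).on('error', reject);
  });
}

// "Photo" and "image" are placeholder class/field names.
new Parse.Query('Photo').each(function(photo) {
  var file = photo.get('image');
  if (!file) { return; }
  return download(file.url()).then(function(body) {
    return s3.putObject({
      Bucket: BUCKET,
      Key: file.name(),
      Body: body,
      ACL: 'public-read'
    }).promise();
  });
}, { useMasterKey: true }).then(function() {
  console.log('Migration complete');
});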

@kvreem

kvreem commented Jun 1, 2016

@tingham Hmm, so would clients that don't upgrade be able to see the new files, or does this simply migrate all existing files to S3?

I too have a Heroku + S3 setup for my parse-server.

@tingham

tingham commented Jun 1, 2016

@kkhattab That is one of the additional issues we're dealing with. What becomes readily apparent is that storing an absolute path to the final file would have been preferable to using "File" references. It's definitely a mess.

@flovilmart
Contributor

Regarding the original issue, which was a configuration problem: has it been solved?

@flovilmart
Contributor

Closing due to lack of activity; please reopen if the issue persists.

Don't forget to include your current:

  • node version
  • npm version
  • parse-server version
  • any relevant logs (VERBOSE=1 will enable verbose logging)
