In GCP (not Firebase) I have a bucket and a function that is called whenever a new file is created in the bucket. This works great.
/**
 * Triggered by a change to a Cloud Storage bucket.
 */
exports.mycompany_upload_file = (event, context) => {
  const filename = event.name;
  const bucketName = event.bucket;
  console.log(`=====================================================`);
  console.log(`Event Type: ${context.eventType}`);
  console.log(`Bucket: ${bucketName}`);
  console.log(`File: ${filename}`);
  console.log(`=====================================================`);
  // now I would like to open that file and read it line-by-line
};
How do I address that file? What is its path? Can I use standard Node libraries like 'fs'?
I found this post that might help you, but basically the answer is: download the file to the function's local disk first. Note that this stores the result on a RAM disk, so your function needs enough memory available to hold the downloaded file.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({ projectId: '<your_project>' });
const bucket = storage.bucket('<your_bucket>');
const file = bucket.file('<path/to/your_file>');

exports.gcstest = (event, callback) => {
  file.download({ destination: '/tmp/test' }, (err) => {
    if (err) {
      console.log(err);
      callback(err);
    } else {
      callback();
    }
  });
};
For more information, check the documentation for the Google Cloud Storage Node.js client.
I had hoped there was a way to read the file in the bucket directly, without first copying it to the (RAM) disk.
Any processing must happen in RAM somewhere. Cloud Storage is not a compute service; if you check its feature list, processing is not among them.