AWS

Hi,

I've been reviewing the AWS posts, but I'm still struggling to get it working correctly.

In ewcfg (also tried without the uploads folder…):
"UPLOAD_DEST_PATH" => "s3://mybucket/uploads/",

I have an access key and secret (I just put these items in the PHP code, roughly as sketched below; I didn't see anywhere else on AWS to assign these credentials to that “mybucket”).
Set the bucket to public access.
Created a folder called “uploads”.
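
Roughly the shape of what I put in the PHP code for the credentials (just a sketch; the key/secret values here are placeholders, not my real ones):

// S3 client with the access key/secret supplied inline (placeholder values)
$s3client = new \Aws\S3\S3Client([
    "version" => "latest",
    "region" => "ca-central-1",            // the bucket's region
    "credentials" => [
        "key" => "MY_ACCESS_KEY_ID",       // placeholder
        "secret" => "MY_SECRET_ACCESS_KEY" // placeholder
    ]
]);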

Debug:

PATH:
“/uploads/temp__0p9uun262n9fgt9178fetkivjh/client_docs/x_atachment/”

host:
“mybucket.s3.canada (central) ca-central-1.amazonaws.com”

message:
“cURL error 3: CURLE_URL_MALFORMAT (3) The URL was not properly formatted.”


Full error: raised class Aws\S3\Exception\S3Exception with message “Error executing “HeadObject” on mybucket/uploads/temp_ …” at WrappedHttpHandler.php line 195.

Any advice/insight appreciated.

  1. In your project you need to include the AWS SDK for PHP; see the topic Server Events and Client Scripts → Server Events → “Global → All Pages” → Global Code → Example 2 in the help file.

  2. host:

“mybucket.s3.canada (central) ca-central-1.amazonaws.com”

Where did you see that? This is not correct.

From the example, the host should be defined with the AWS SDK for PHP like this:

$s3client = new \Aws\S3\S3Client([
    "version" => "latest",
    "region" => "ca-central-1" // Change to your own region
    //, "http" => ["verify" => FALSE] // Disable certificate verification (this is insecure!)
]);
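
Note also that because UPLOAD_DEST_PATH uses an s3:// path, the SDK's stream wrapper should be registered on that client so PHP can read and write s3:// URLs (a minimal sketch; add it right after creating the client):

$s3client->registerStreamWrapper(); // after this, s3://mybucket/... paths go through the SDK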

Hi Arbei,

  1. In your project you need to include the AWS SDK for PHP; see the topic Server Events and Client Scripts → Server Events → “Global → All Pages” → Global Code → Example 2 in the help file.

That's done.

  2. host:

“mybucket.s3.canada (central) ca-central-1.amazonaws.com”

Where did you see that? This is not correct.

I copied it from the AWS account screen; I will change it as noted below.

$s3client = new \Aws\S3\S3Client([
    "version" => "latest",
    "region" => "ca-central-1" // Change to your own region
    //, "http" => ["verify" => FALSE] // Disable certificate verification (this is insecure!)
]);

thanks

Seems a little better… (bucket names changed to protect the innocent.)

Getting closer; now I'm getting a 404 Not Found response back from GuzzleHttp:

“Client error: HEAD mybucket-2.s3.ca-central-1.amazonaws.com/uploads/AAMI%5Cdocuments%5C10001236/IMAGE.jpg resulted in a 404 Not Found response”

I have the paths set as:

"UPLOAD_TEMP_PATH" => "uploads/", // Upload temp path (absolute local physical path)
"UPLOAD_DEST_PATH" => "s3://mybucket/uploads/", // Upload destination path (relative to app root)

I updated this:
"UPLOAD_TEMP_HREF_PATH" => "uploads", // Upload temp href path (absolute URL path for download)

The Add Document page now appeared and went through the motions of uploading a file… but I have no clue where it went. When I checked the S3 storage, nothing is there… and the local uploads folder is empty except for the temp__ folders…

For public access of the bucket, under ACL, “Everyone” has nothing enabled.
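
To check whether anything actually lands in the bucket, I have been running a quick listing with the same client (just a sketch; “mybucket” is the placeholder name again):

// List whatever is under uploads/ in the bucket
$result = $s3client->listObjectsV2([
    "Bucket" => "mybucket",
    "Prefix" => "uploads/"
]);
foreach ($result["Contents"] ?? [] as $object) {
    echo $object["Key"] . "\n";
}

Because of the Prefix, this only returns keys under uploads/, so it's a quick way to confirm whether an upload ever reached the bucket.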

Read the help file → Tools → Advanced Settings → File upload URL path / File upload path (absolute) for temporary files / File upload URL path (absolute) for temporary files.

Got it going, thanks Arbei.

Changed these for testing:
"UPLOAD_TEMP_PATH" => "/tmp", // Upload temp path (absolute local physical path)
"UPLOAD_TEMP_HREF_PATH" => "/tmp", // Upload temp href path (absolute URL path for download)

Then the other issue was the path that is built on upload…

Developing on Windows, the PATH_DELIMITER is “\”, so when the upload was happening it was replacing the “\” with the escaped code (%5C) and failing. Once I changed it to “/”, the upload worked. Not sure if that's true or not, but it was then able to upload and view the test files…
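
For anyone hitting the same thing, the fix amounts to making sure the object key uses “/” instead of “\”; changing PATH_DELIMITER did it for me, but the equivalent normalization looks like this (a sketch; the variable names are just for illustration):

// S3 object keys use "/" as the separator; a Windows-style "\" gets URL-encoded as %5C in the request
$relativePath = "AAMI\\documents\\10001236\\IMAGE.jpg";      // what the Windows build produced
$s3Key = "uploads/" . str_replace("\\", "/", $relativePath); // => "uploads/AAMI/documents/10001236/IMAGE.jpg"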

I pushed the code to my dev Linux box with the original path string using “PATH_DELIMITER” and all was fine.

thanks,