Azure Storage Support #74

Open
hari-narayanan94 opened this issue Mar 9, 2022 · 7 comments

Comments

@hari-narayanan94

This tool looks like exactly what I need for my AKS clusters, but unfortunately I'm not able to find any option to send the core dump to a native Azure solution. I see S3 is supported; is there any way to send those dumps over to Azure Blob Storage or a file share? Azure doesn't natively support the S3 protocol.

@No9
Collaborator

No9 commented Mar 9, 2022

Hey @hari-narayanan94
Thanks for the feedback.
If you needed this today, you could look at the Microsoft open source blog post that suggests using a MinIO shim to create an S3-compatible API for Blob Storage:
https://cloudblogs.microsoft.com/opensource/2017/11/09/s3cmd-amazon-s3-compatible-apps-azure-storage/
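
The approach from that post can be sketched roughly as follows. This assumes MinIO's (since-deprecated) Azure gateway mode; the account name and key are placeholders, and the exact environment variable names depend on the MinIO release you run:

```shell
# Hypothetical sketch: run a MinIO shim that translates S3 calls to Azure Blob.
# The access key is your Azure storage account name, the secret is its key.
export MINIO_ROOT_USER=mystorageaccount        # placeholder account name
export MINIO_ROOT_PASSWORD='<storage-account-key>'
minio gateway azure
# MinIO then serves an S3-compatible API on :9000 backed by Blob Storage.
```

Older MinIO releases used `MINIO_ACCESS_KEY`/`MINIO_SECRET_KEY` instead of the `MINIO_ROOT_*` variables, so check the version you deploy.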

Medium term (the next few months) I'll look at the Azure SDK https://crates.io/crates/azure_storage_blobs so that it's a first-class citizen.

Also note that you can disable the S3 upload aspect and provide your own uploader, as documented here:
https://github.com/IBM/core-dump-handler/blob/main/FAQ.md#how-should-i-integrate-my-own-uploader
It's a bit cumbersome as you have to deal with file semantics right now, but I plan to integrate an event API as part of the next release that should make this a lot more straightforward.
#61

@hari-narayanan94
Author

Thanks for the quick response. I'll try to set it up on my AKS cluster with the workaround today and share my experience.

@hari-narayanan94
Author

I tried MinIO and everything looks to be working on that side; I now have a functioning S3 endpoint. When I installed core-dump-handler on my AKS cluster and simulated a crash, the ZIP gets generated but it fails to upload with the following error:

[2022-03-10T14:25:22Z INFO core_dump_agent] Uploading: /var/mnt/core-dump-handler/cores/3d7bc367-569a-497e-b929-3b6c33b5dd8c-dump-1646921760-segfaulter-segfaulter-1-4.zip
[2022-03-10T14:25:22Z INFO core_dump_agent] zip size is 28070
[2022-03-10T14:25:22Z ERROR core_dump_agent] Upload Failed custom: missing field Bucket
[2022-03-10T14:25:22Z INFO core_dump_agent] Uploading: /var/mnt/core-dump-handler/cores/75c2971f-d165-497c-955b-93be45edde1a-dump-1646921675-segfaulter-segfaulter-1-4.zip
[2022-03-10T14:25:22Z INFO core_dump_agent] zip size is 28085
[2022-03-10T14:25:23Z ERROR core_dump_agent] Upload Failed custom: missing field Bucket
[2022-03-10T14:25:23Z INFO core_dump_agent] Uploading: /var/mnt/core-dump-handler/cores/e1184d7f-f110-49b7-ad50-a87bbb5c005e-dump-1646921436-segfaulter-segfaulter-1-4.zip
[2022-03-10T14:25:23Z INFO core_dump_agent] zip size is 28087
[2022-03-10T14:25:23Z ERROR core_dump_agent] Upload Failed custom: missing field Bucket
[2022-03-10T14:25:23Z INFO core_dump_agent] INotify Starting...
[2022-03-10T14:25:23Z INFO core_dump_agent] INotify Initialised...
[2022-03-10T14:25:23Z INFO core_dump_agent] INotify watching : /var/mnt/core-dump-handler/cores
[2022-03-10T14:25:54Z INFO core_dump_agent] Uploading: /var/mnt/core-dump-handler/cores/67e1aa29-d87f-44c9-9bb5-6f8f71e72a4b-dump-1646922354-segfaulter-segfaulter-1-4.zip
[2022-03-10T14:25:54Z INFO core_dump_agent] zip size is 28071
[2022-03-10T14:25:55Z ERROR core_dump_agent] Upload Failed custom: missing field Bucket

@No9
Collaborator

No9 commented Mar 10, 2022

Did you install the chart with the bucket option?
--set daemonset.s3BucketName=NAME_OF_BUCKET
And did you create the bucket on the MinIO server?

@hari-narayanan94
Author

hari-narayanan94 commented Mar 10, 2022

Yes, I did use that option when applying the Helm chart; I used the web app URL as the s3BucketName. Below is the Helm install command I used:

helm.exe install my-core-dump-handler core-dump-handler/core-dump-handler --set daemonset.s3AccessKey=storageaccountname --set daemonset.s3Secret=xxxxx --set daemonset.s3BucketName=https://xxxx.azurewebsites.net --set daemonset.s3Region=us-east-1

FYI, since I only have Azure, I used the link you suggested to expose the S3 protocol for my storage account via the MinIO shim:

https://cloudblogs.microsoft.com/opensource/2017/11/09/s3cmd-amazon-s3-compatible-apps-azure-storage/

I don't see anywhere in MinIO that asks me to set up a bucket name. When I verify the MinIO server by logging in via an S3 browser, it shows me all the blobs in the storage account.

@No9
Collaborator

No9 commented Mar 11, 2022

OK so using https://xxxx.azurewebsites.net/ as a bucket won't work.

According to the article, you should be able to use s3cmd to make a bucket:

$ ./s3cmd mb s3://testbucket

You should then be able to use the option --set daemonset.s3BucketName=testbucket
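
Combining those two steps, a corrected install (reusing the placeholder keys from the earlier command, with the bucket name instead of a URL) might look like:

```shell
# Create the bucket via the MinIO shim first
./s3cmd mb s3://testbucket

# Then install the chart, passing the bucket *name*, not an endpoint URL
helm install my-core-dump-handler core-dump-handler/core-dump-handler \
  --set daemonset.s3AccessKey=storageaccountname \
  --set daemonset.s3Secret=xxxxx \
  --set daemonset.s3BucketName=testbucket \
  --set daemonset.s3Region=us-east-1
```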

@No9
Collaborator

No9 commented Dec 30, 2022

As external events landed in the v8.9.0 release, the groundwork is in place to create an agent that uploads to Azure using the azure_storage_blobs library or any other Azure integration. The idea is to disable the uploading agent by setting useINotify: false and implement a container that looks for files in the event folder once events have been enabled.
I won't be starting work on the Azure agent anytime soon, but I'm happy to work with someone who picks it up.
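
As a rough illustration of what such a container could do, here is a minimal polling sketch using the Azure CLI. The storage account, container name, and mount path are placeholders, and a real implementation would use the event folder and the SDK rather than polling with `az`:

```shell
#!/bin/sh
# Sketch of a sidecar loop, assuming useINotify: false and the handler's
# core output directory mounted at /var/mnt/core-dump-handler/cores.
CORES_DIR=/var/mnt/core-dump-handler/cores

while true; do
  for f in "$CORES_DIR"/*.zip; do
    [ -e "$f" ] || continue
    az storage blob upload \
      --account-name mystorageaccount \
      --container-name core-dumps \
      --name "$(basename "$f")" \
      --file "$f" \
    && rm "$f"   # delete only after a successful upload
  done
  sleep 30
done
```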
