New plugin for downloading internal repository packages when the same name also exists externally #279
Conversation
Thanks for the contribution! But I'm not sure I understand the purpose. Why not just use a virtual repository?
If it's an npm repository and the packages have the same name, this is not going to work. For example: suppose we have one local repository npm-local and one remote repository npm-remote, and a package node-package exists in both. Since a lot of internal automated scripts use the npm install node-package command, the only option was a user plugin that effectively forces the download of the internal package version. Hence this plugin.
I'm still a little confused.
Is this not because that's the repository with the latest version? I expect if 2.x.x was in npm-local and 1.x.x was in npm-remote, it should always give you the package from npm-local. Or if they both had 2.x.x, it should always give you whichever package came first in the virtual list. Is this not the case?
But the internal package is version 1.x.x, not 2.x.x, right? This plugin will look for 2.x.x in npm-local, and will not find it. Then it will allow the package to be taken from npm-remote. This is the same behavior you would see without the plugin. Please let me know what I'm missing. Thanks!
Hi @ginokurian87, thanks for your contribution. I think I share @DarthFennec's confusion. It seems to me that the virtual behavior of fetching the newer version from the remote should be the way to go, since you did not specify a particular version in your request. One thing to note is that a virtual repo will always try to resolve from the locals before reaching the remotes, regardless of the order of the repos in the virtual configuration. If you try to fetch a dynamic version, though, like the latest for a given package, keep in mind that Artifactory will serve the latest version it can find. So if the remote is not available, it will serve the latest version present in the local repo. With all that said, I'm not able to see where a plugin like the one you are proposing fits into the picture.
Hi @elioengcomp, @DarthFennec: I do not want the default behaviour; I always want Artifactory to fetch me the latest version in the local repository, whatever the version in the remote repo may be. Hence, in this case, I'm slightly altering/extending the way Artifactory works with this plugin.
@elioengcomp I concur with @ginokurian87 that this is a valid use case. It is an unusual one, but it is not the first time we have heard a request like this. The critical part, in my view, is that the README documents clearly that this changes the normal behavior, and gives a little context as to why. @ginokurian87, specifically, could you update the README to very explicitly state something like "This plugin changes the normal behavior of virtual repositories" and then add a why, like "to address circumstances where an internal local package may have a name conflict with a public one and is still desired (although best practice remains to avoid such conflicts)", or some such?
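For illustration, the behavior being requested could be sketched with Artifactory's user-plugin download hooks roughly as follows. This is a hedged sketch, not the PR's actual code: the repository name npm-local and the same-path lookup are assumptions made for the example.

```groovy
import org.artifactory.repo.RepoPathFactory

// Illustrative sketch only: when content is about to be fetched through a
// remote npm repository, check whether the same artifact path exists in an
// assumed internal repository ("npm-local") and, if so, serve that copy
// instead of the external one.
download {
    altRemoteContent { repoPath ->
        def conf = repositories.getRepositoryConfiguration(repoPath.repoKey)
        if (conf?.type == "remote" && conf?.packageType == "npm") {
            def localPath = RepoPathFactory.create("npm-local", repoPath.path)
            if (repositories.exists(localPath)) {
                // Replace the remote content with the internal artifact
                def handle = repositories.getContent(localPath)
                inputStream = handle.inputStream
                size = handle.size
            }
        }
    }
}
```

A real implementation would need a name-based rather than exact-path match, since the internal and external versions of a package generally differ.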
RepoPath newPath = null;
boolean foundInLocal = false;
// change it to match only the external npm name
if (repoKey.contains("npm") && externalNpmRepo_prop.contains(repoKey)) {
What is externalNpmRepo_prop? I don't see that defined anywhere.
The repo name may contain "npm" even if it isn't an NPM repository, and it may not contain "npm" even if it is. I think a better approach would be to check the type of the repo, rather than its name:
def repo = repositories.getRepositoryConfiguration(repoKey)
if (repo.type == "remote" && repo.packageType == "npm") {
// ...
}
I will agree on this point, as it will make the plugin generic; we wanted specific filtering based on names for now, though. Since the plugin is going to be generic, I will correct it and upload a new one.
download {
    altRemoteContent { repoPath ->
I think this is the wrong hook. altRemoteContent triggers at the boundary between Artifactory and a remote location, not at the one between a user and Artifactory. One effect of this is that the alternate content is saved in the remote cache in Artifactory, which I can see causing problems. Say someone downloads a remote package foo 3.4, and then someone else uploads a local package foo 1.0. The next time someone downloads foo, they'll get foo 3.4 from the remote repository cache, and the plugin won't run. Or say someone downloads foo 1.0, which gets cached, and then the local foo 1.0 is removed. The next time someone downloads foo, they'll get foo 1.0 from the cache instead of foo 3.4.

The altResponse hook seems more correct here, to me.
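As a hedged sketch of that suggestion (the repository name npm-local and the same-path existence check are illustrative assumptions), an altResponse hook runs on the user-facing response, so rejecting or replacing content there leaves the remote-cache state untouched:

```groovy
import org.artifactory.repo.RepoPathFactory

// Illustrative sketch only: if a response is about to be served from a remote
// npm repository (or its cache) while an internal copy of the same artifact
// path exists in an assumed local repo, refuse the external content.
download {
    altResponse { request, responseRepoPath ->
        def conf = repositories.getRepositoryConfiguration(responseRepoPath.repoKey)
        if (conf?.type == "remote" && conf?.packageType == "npm") {
            def localPath = RepoPathFactory.create("npm-local", responseRepoPath.path)
            if (repositories.exists(localPath)) {
                status = 404
                message = "This package is served from the internal repository only"
            }
        }
    }
}
```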
@DarthFennec: Thanks for pointing this out. The requirement for us was to always take the foo package from the internal repo; no external requests for foo would be permitted. In this case, if someone tries to specifically get foo 3.4, they should get a 404. In that scenario the package foo 3.4 will never be in the external cache, so the external cache will always hold the latest version of the local package. Also, I do not think we will be removing packages from the local repository. Another reason for choosing this hook was to minimise the number of calls to the plugin and keep response times low for all other calls.
> The requirement for us was to always take the foo package in the internal repo.

That's of course assuming a foo package exists in the internal repo; my first example was about the case where the remote foo package was cached before the first internal version of foo was added to the local repo. Beyond that, I agree with your logic, so it's fine if this stays as-is.

> If someone tries to specifically get foo 3.4 they should get a 404.

Good to know. In that case, some of the rest of the code is wrong; it seems to check specifically for the requested version, not just the package name. I'll take another look and see if I can make more specific comments.
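For a name-based check, the npm registry convention stores tarballs at paths like `<name>/-/<name>-<version>.tgz`, so the package name can be derived from the path prefix. The helper below is an illustrative assumption, not code from this PR:

```groovy
// Illustrative helper: derive the npm package name from an artifact path so a
// local-existence check can match any version of the package, not only the
// exact versioned path that was requested.
String npmPackageName(String path) {
    int idx = path.indexOf('/-/')
    return idx > 0 ? path.substring(0, idx) : path
}

// e.g. "node-package/-/node-package-1.0.0.tgz" -> "node-package"
assert npmPackageName('node-package/-/node-package-1.0.0.tgz') == 'node-package'
// scoped packages keep their scope prefix
assert npmPackageName('@scope/pkg/-/pkg-1.0.0.tgz') == '@scope/pkg'
```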
Uploaded a new version addressing review comments.
Thank you for the contribution. As we are minimizing our effort on plugins, we will not be able to support this request. We recommend you move towards JFrog Workers. Closing this PR.