
Question #1

Open
nssr opened this issue Feb 24, 2013 · 5 comments
@nssr

nssr commented Feb 24, 2013

Hi,

Can you give some instructions on how to use this code and its functions? Also, which function saves the data to the train.csv and test.csv files?

Thanks.

@syllogismos
Owner

I will explain in detail how to use the code. Meanwhile, read the paper "Supervised Random Walks" so that you too can help me if there are any bugs. I will try to do it before next Sunday :)

@nssr
Author

nssr commented Apr 7, 2013

Hi,
Me again :). As I understand from the document, the final result is the derivative of the function F(w) with respect to w, but I can't find it in your code.
If I am wrong, could you please tell me which function gives the final result? (I think it is one of these: DifferenceIndices.m, GetNodesFromParam.m, or train.m, but I am not sure.)
Thanks.

@syllogismos
Owner

@nssr lossfunction.m does that.

If you look at the descriptions at the beginning of each file, you will understand it fairly easily.
lossfunction.m returns the loss for a particular set of parameters and the gradient with respect to each parameter. You feed these into fmincg.m to get the optimum set of parameters.
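
The loss being minimized is the objective from the paper, roughly (quoting Backstrom & Leskovec from memory, so check the paper for the exact form):

$$
\min_w \; F(w) = \|w\|^2 + \lambda \sum_{d \in D,\ l \in L} h(p_l - p_d)
$$

where p is the stationary distribution of the random walk with restarts whose edge strengths are parameterized by w, D is the set of nodes that did become friends, L the set that did not, and h is a smooth penalty for a node in L scoring above a node in D. That loss and its gradient with respect to w are what lossfunction.m returns.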

fmincg is like fminunc.
Type "help fminunc" in the Octave window to learn what it does.

You can use similar kinds of functions to solve for the parameters.
The way fminunc/fmincg works is that it takes a function (like lossfunction.m) that calculates the loss and the gradient with respect to all parameters, the initial values of the parameters to start the gradient descent from, the number of iterations, and so on. Hope that makes sense.
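
Roughly, the call looks like the sketch below. This is only a sketch: I'm assuming fmincg.m is the usual ML-class implementation (cost-function handle, initial parameter vector, options struct with MaxIter) and guessing at lossfunction.m's argument list, so check the description at the top of lossfunction.m for the real signature.

```matlab
% Sketch only -- lossfunction.m's real arguments are documented in its header.
num_features = 5;                       % number of edge features in your network
initial_w    = zeros(num_features, 1);  % starting point for the optimization
options      = optimset('MaxIter', 100);

% fmincg needs a function of w alone, so wrap lossfunction.m in an anonymous
% function; network/sources/destinations stand in for whatever data it needs.
costFunc = @(w) lossfunction(w, network, sources, destinations);

% Returns the optimized parameters and the loss at each iteration.
[w_opt, loss_history] = fmincg(costFunc, initial_w, options);
```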

I'm working on writing a detailed description of how all of it works.

You can read the description at the beginning of each file to understand what it does.
I will clean my code up a little and show how to run it on a sample network soon.

@ghost assigned syllogismos Apr 8, 2013
@syllogismos
Owner

train.m (the code here is a little incomplete; if you understand how fminunc works you can finish it yourself) trains on the network and gives us the parameters.
getnodesfromparam.m gives the top nodes that are likely to be new friends, using the parameters obtained from train.m.

differenceindices.m is a helper function.
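
Putting it together, the intended flow is roughly the sketch below; the argument lists are only my guesses, and the real ones are documented at the top of each file.

```matlab
% Hypothetical end-to-end usage -- check the headers of train.m and
% getnodesfromparam.m for the actual signatures.

% train.m wraps fmincg around lossfunction.m and returns the learned parameters.
w = train(network, sources, destinations);

% Rank candidate nodes for one source node using the learned parameters.
k = 10;                                            % how many candidates to return
top_nodes = getnodesfromparam(w, network, source_node, k);
```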

@syllogismos
Owner

@nssr http://syllogismos.github.io/facebook-link-prediction/ Hope this helps. Jump to the end to see how to use my code.
