A project aiming to fight climate change by providing a way to incentivise growth in the entire solar industry with a single transaction. Solar energy producers are paid in stablecoins and funders are rewarded with native tokens based on the amount of energy produced.
https://devpost.com/software/soleil
https://www.youtube.com/watch?v=9V7hv01URL8
https://festive-wescoff-a5dded.netlify.app/
- Set the `CHAINLINK_NODE_ADDRESS` environment variable to the address of the node which will be submitting payment merkle roots to the Pool Manager contract.
- Set the `ALCHEMY_API_KEY` and `PRIVATE_KEY` environment variables with details to allow deployment of contracts to Rinkeby.
- Compile contracts with `npx hardhat compile`.
- Run tests with `npx hardhat test`.
- Grab some Rinkeby ETH from a faucet and run `npx hardhat run scripts/deploy.ts --network rinkeby`.
- Paste the deployed pool manager contract address into `/web-app/src/poolManagerContract.json`.
- Paste the deployed pool manager contract address into `/dai-earnings-calculator-ea/poolManagerContract.json`.
- Create a Moralis server instance.
- Set the `REACT_APP_MORALIS_APPLICATION_ID` and `REACT_APP_MORALIS_SERVER_URL` environment variables with details from the Moralis server instance.
- Set the `REACT_APP_CERAMIC_API_URL` environment variable for `/web-app` to a Ceramic node/gateway URL on the Clay testnet.
- Set the `REACT_APP_RPC_URL` environment variable to your Rinkeby RPC node URL.
- Run `yarn install && yarn start`.
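Taken together, the environment setup above might look like the following shell sketch. All values are placeholders for illustration; substitute your own keys, addresses, and URLs:

```shell
# Contract deployment environment (placeholder values)
export CHAINLINK_NODE_ADDRESS="0xYourChainlinkNodeAddress"  # node submitting merkle roots
export ALCHEMY_API_KEY="your-alchemy-api-key"
export PRIVATE_KEY="your-deployer-private-key"

# Web app environment (values come from your Moralis server instance)
export REACT_APP_MORALIS_APPLICATION_ID="your-moralis-app-id"
export REACT_APP_MORALIS_SERVER_URL="https://your-subdomain.usemoralis.com:2053/server"
export REACT_APP_CERAMIC_API_URL="https://your-ceramic-clay-node:7007"
export REACT_APP_RPC_URL="https://your-rinkeby-rpc-node.example.com"
```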
This external adapter populates the energy production data stream on Ceramic. To do this it reads the API credentials and Ethereum address for each solar site registered in the Moralis database connected to the web app, and uses them to query energy production data from the monitoring API provider. Currently the only supported API provider is SolarEdge - see their API docs here. Non-sensitive energy production figures are published to the data stream alongside the site's Ethereum address.
Deploy this external adapter if you want to populate your own Ceramic data stream with energy data from solar sites which register in your Moralis database through your web app. To do so:
- Either generate a seed or use an existing seed for your Ceramic authentication. Set the `SEED` environment variable to this seed.
- Set the `CERAMIC_API_URL` environment variable to a read/write node on the Ceramic Clay testnet.
- Set the `MORALIS_APPLICATION_ID` and `MORALIS_SERVER_URL` environment variables with details from your Moralis server instance.
- Run `npm run bootstrap` to pin and bootstrap your Ceramic schema and data streams. Stream IDs will be written to `config.json` files in all projects which need them.
- Deploy the adapter to a cloud hosting solution or run it locally.
- Create a bridge between your Chainlink node and the external adapter.
- Create a CRON job for the adapter. Here's an example:
```toml
type = "cron"
schemaVersion = 1
name = "data-feed-hourly-cron"
schedule = "CRON_TZ=UTC 0 0 * * * *"
observationSource = """
fetch [type=bridge name="YOUR_BRIDGE_NAME" requestData="{\\"id\\": \\"0\\"}"]
fetch
"""
```
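The bridge referenced above is created through the Chainlink node's operator UI or API; a minimal bridge definition might look like the following, where the name and URL are placeholders for your own adapter deployment:

```json
{
  "name": "YOUR_BRIDGE_NAME",
  "url": "https://your-adapter-host.example.com/",
  "confirmations": 0,
  "minimumContractPayment": "0"
}
```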
This process is currently centralised and would need to be decentralised across a number of nodes using a process similar to Chainlink's current off-chain reporting.
This external adapter reads total daily DAI distribution data from the pool manager smart contract and energy production data from the Ceramic data stream. It is then able to calculate how much DAI each solar site has earned and publishes this data to the cumulative DAI earnings Ceramic data stream. The merkle root of this cumulative earnings data is then returned to the Chainlink node to be submitted to the pool manager contract. Multicall is used to reduce the number of calls made to the RPC node.
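The merkle root construction can be sketched as follows. This is an illustration only: the real adapter presumably hashes ABI-encoded (address, cumulative amount) leaves with keccak256, but this sketch uses sha256 from Node's standard library, and the leaf format and sample addresses are made up:

```typescript
import { createHash } from "crypto";

// Hash helper used purely for illustration (the on-chain side
// would require keccak256 over ABI-encoded leaves).
const sha256 = (data: string): string =>
  createHash("sha256").update(data).digest("hex");

// Build a merkle root by hashing leaves, then repeatedly pairing
// and hashing until a single root remains.
function merkleRoot(leaves: string[]): string {
  if (leaves.length === 0) throw new Error("no leaves");
  let level = leaves.map(sha256);
  while (level.length > 1) {
    const next: string[] = [];
    for (let i = 0; i < level.length; i += 2) {
      // Duplicate the last node when the level has an odd length.
      const right = level[i + 1] ?? level[i];
      next.push(sha256(level[i] + right));
    }
    level = next;
  }
  return level[0];
}

// Hypothetical cumulative earnings snapshot: "address:amountInWei".
const leaves = [
  "0x1111111111111111111111111111111111111111:2500000000000000000",
  "0x2222222222222222222222222222222222222222:1200000000000000000",
];
console.log(merkleRoot(leaves));
```

Because the pool manager contract only stores the root, each user can later prove their cumulative earnings with a short merkle proof instead of the contract storing every balance.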
- Either generate a seed or use an existing seed for your Ceramic authentication. Set the `SEED` environment variable to this seed.
- Set the `CERAMIC_API_URL` environment variable to a read/write node on the Ceramic Clay testnet.
- Set the `RPC_URL` environment variable to your Rinkeby RPC node URL.
- Deploy the adapter to a cloud hosting solution or run it locally.
- Create a bridge between your Chainlink node and the external adapter.
- Create a CRON job for the adapter. Here's an example:
```toml
type = "cron"
schemaVersion = 1
name = "dai-earnings-calculator-hourly-5-mins-offset-cron"
schedule = "CRON_TZ=UTC 0 5 * * * *"
observationSource = """
fetch [type=bridge name="YOUR_BRIDGE_NAME" requestData="{\\"id\\": \\"0\\"}"]
parse [type=jsonparse path="data,result" data="$(fetch)"]
encode_data [type=ethabiencode abi="submitDaiMerkleRoot(bytes32 _root)" data="{ \\"_root\\": $(parse) }"]
submit_tx [type=ethtx to="YOUR_POOL_MANAGER_CONTRACT_ADDRESS" data="$(encode_data)"]
fetch -> parse -> encode_data -> submit_tx
"""
```
This external adapter reads daily scheduled DAI distribution data from the pool manager smart contract so it can calculate the percentage each user has contributed to the total daily DAI distributions. It then reads the energy production data from the Ceramic data stream and calculates the SLL rewards per user using the following formula:
(User's daily DAI distribution * Total energy produced that day) / (1000 * Total daily DAI distribution)
The SLL rewards data is then published to the cumulative SLL earnings data stream. The merkle root of this data is then returned to the Chainlink node to be submitted to the pool manager contract. Multicall is used to reduce the number of calls made to the RPC node.
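Plugging illustrative numbers into the formula above makes the calculation concrete. The 1000 scaling factor comes from the formula itself; the distribution amounts, and the assumption that energy is measured in Wh, are made up for this example:

```typescript
// Worked example of the SLL reward formula (illustrative numbers).
// A funder scheduled 100 DAI of the day's 1,000 DAI total distribution,
// and all sites together produced 5,000 Wh that day.
const userDailyDaiDistribution = 100;
const totalDailyDaiDistribution = 1000;
const totalEnergyProducedWh = 5000;

// (User's daily DAI distribution * Total energy produced that day)
//   / (1000 * Total daily DAI distribution)
const sllReward =
  (userDailyDaiDistribution * totalEnergyProducedWh) /
  (1000 * totalDailyDaiDistribution);

console.log(sllReward); // 0.5 SLL
```

Note that a funder's reward scales with both their share of the day's distributions and the total energy produced, so rewards grow as the industry as a whole produces more.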
- Either generate a seed or use an existing seed for your Ceramic authentication. Set the `SEED` environment variable to this seed.
- Set the `CERAMIC_API_URL` environment variable to a read/write node on the Ceramic Clay testnet.
- Set the `RPC_URL` environment variable to your Rinkeby RPC node URL.
- Set the `MORALIS_APPLICATION_ID`, `MORALIS_SERVER_URL` and `MORALIS_MASTER_KEY` environment variables with details from your Moralis server instance.
- Deploy the adapter to a cloud hosting solution or run it locally.
- Create a bridge between your Chainlink node and the external adapter.
- Create a CRON job for the adapter. Here's an example:
```toml
type = "cron"
schemaVersion = 1
name = "sll-rewards-calculator-hourly-5-mins-offset-cron"
schedule = "CRON_TZ=UTC 0 5 * * * *"
observationSource = """
fetch [type=bridge name="YOUR_BRIDGE_NAME" requestData="{\\"id\\": \\"0\\"}"]
parse [type=jsonparse path="data,result" data="$(fetch)"]
encode_data [type=ethabiencode abi="submitSllMerkleRoot(bytes32 _root)" data="{ \\"_root\\": $(parse) }"]
submit_tx [type=ethtx to="YOUR_POOL_MANAGER_CONTRACT_ADDRESS" data="$(encode_data)"]
fetch -> parse -> encode_data -> submit_tx
"""
```