Migrate to pyenergyplus #541
base: develop
Conversation
@@ -164,6 +164,13 @@ router.get("/runs/:runId/time", async (req, res, next) => {
    .catch(next);
});

router.get("/runs/:runId/log", async (req, res, next) => {
This would allow job/run logs to be accessed from the web api.
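For consumers of the new endpoint, a minimal Python client sketch (the helper names and base URL are illustrative; only the `/runs/:runId/log` path comes from this diff):

```python
import requests


def run_log_url(base_url: str, run_id: str) -> str:
    # Path matches the route added in this diff.
    return f"{base_url}/runs/{run_id}/log"


def fetch_run_log(base_url: str, run_id: str) -> str:
    response = requests.get(run_log_url(base_url, run_id))
    response.raise_for_status()  # surface non-2xx responses as exceptions
    return response.text
```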
RUN pip3.8 install virtualenv \
    && pip3.8 install \
        scipy \
        symfit
The worker is no longer providing builtin python dependencies outside of what base alfalfa needs.
This is partially due to removing the `refrig_case` test, which required `symfit`. However, `symfit` doesn't support Python 3.12 and the project in general seems semi-abandoned, so I removed the test case and replaced it with a small-office-based test model.
So in other words, if someone has an EnergyPlus model with Python EMS that uses third party Python modules, they're out of luck with Alfalfa. Is that right? It seems reasonable, as Alfalfa cannot anticipate every module that someone might want to use.
Can we say that Alfalfa supports any module that comes bundled with EnergyPlus?
Not quite. Per https://github.com/NREL/alfalfa/wiki/How-to-Migrate-EnergyPlus-Python-Plugins, you can provide a `requirements.txt`, which results in a virtual environment being created for your model with your requirements installed.
if __name__ == '__main__':

    basicConfig(level=os.environ.get("LOGLEVEL", "INFO"),
Output logs at `LOGLEVEL` and above to stdout.
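A self-contained version of that pattern; the `format` string and `force=True` are additions for the example, not part of the diff:

```python
import logging
import os

# Logs at LOGLEVEL (default INFO) and above are emitted; force=True lets
# the example work even if logging was already configured in this process.
logging.basicConfig(
    level=os.environ.get("LOGLEVEL", "INFO"),
    format="%(asctime)s %(name)s %(levelname)s: %(message)s",
    force=True,
)
logger = logging.getLogger("alfalfa_worker")
logger.debug("hidden unless LOGLEVEL=DEBUG")
logger.info("visible at the default level")
```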
from .create_run import CreateRun
from .step_run import StepRun
# from .create_run import CreateRun
# from .step_run import StepRun
Removed to prevent source files from being automatically loaded. The `__init__.py` is run when the module gets imported, and in the dispatcher unit tests this can cause a crash, as `StepRun` has dependencies (`openstudio`, `pyenergyplus`) which may not be present on a developer's computer and are not present in the CI runner.
This could be avoided by using a job other than `CreateRun` to confirm that the dispatcher is able to properly resolve job paths.
@@ -33,18 +31,18 @@ def exec(self):
    # If there are requirements.txt files in the model create a python virtual environment and install packaged there
    requirements = self.run.glob("**/requirements.txt")
    if len(requirements) > 0:
        check_output(["python3.8", "-m", "venv", "--system-site-packages", "--symlinks", str(self.dir / '.venv')])
        self.log_subprocess(["python3.12", "-m", "venv", "--system-site-packages", "--symlinks", str(self.dir / '.venv')])
`log_subprocess` takes the output of the subprocess and logs it. This makes logs more useful, as they include more of the output from the job.
Not related to the logging, but perhaps it makes sense not to reference Python by exact version; this seems like unnecessary repetition. Why not just link `python` or `python3` to the version you want in the container setup?
hour_index: int = 1
minute_index: int = 2
month_index: int = 3

def __init__(self, run: Run) -> None:
Add better documentation to this class.
@dataclass
class Options:
Consolidated options for `StepRunBase`-extending jobs into one place.
TODO: document each of these options better
Maybe that is an opportunity to consolidate the class hierarchy a little?
"""Placeholder for updating the datetime in Mongo to current simulation time""" | ||
self.logger.info("Advancing to start time") | ||
while self.run.sim_time < self.options.start_datetime: | ||
self.logger.info("Calling advance") |
Make this a debug message, and increase the verbosity.
class Job(metaclass=JobMetaclass):
    """Job Base Class"""

    @error_wrapper
This is the only place this is used; it could probably be moved into the function body.
@@ -86,7 +86,7 @@ def __init__(self, **kwargs):
    self.u_store = copy.deepcopy(self.u)
    # Set default options
    self.options = self.fmu.simulate_options()
    self.options['CVode_options']['rtol'] = 1e-6
    # self.options['CVode_options']['rtol'] = 1e-6
CVode is no longer included by default. FMUs must provide their own solver.
@TShapinsky @kbenne If we are changing expected format/requirements for FMU upload, we should update docs.
Indeed, FMUs for "co-simulation" are now the standard for BOPTEST, and I guess here too, which means the solver is baked into the FMU.
I think this PR is key context.
- mc
- mongo
- redis
- worker
Removed redundant configuration.
@@ -72,6 +73,7 @@ def test_point_retrieval(base_url, run_id, alfalfa_client):
    for point in outputs + bidirectionals:
        response = requests.get(f"{base_url}/runs/{run_id}/points/{point['id']}")

        response.raise_for_status()
This raises an exception if the return code isn't a 2XX.
@@ -179,7 +182,7 @@ def test_point_writes(base_url, run_id):
    request_body = {
        'value': "hello"
    }
    response = requests.put(f"{base_url}/runs/{run_id}/points/{inputs[0]['id']}", json=request_body)
    response = requests.put(f"{base_url}/runs/{run_id}/points/{bidirectionals[0]['id']}", json=request_body)
Switched because the default model no longer has any pure inputs, only bidirectional points.
Hi @TShapinsky and @anyaelena. I left a lot of comments for thought, but this seems like a good PR and I'm excited to see it merged.
Nice job @TShapinsky
@@ -92,6 +92,11 @@ class AlfalfaAPI {
    return await getHashValue(this.redis, run.ref_id, "sim_time");
  };

  getRunLog = async (run) => {
    const log_lines = await this.redis.lRange(`run:${run.ref_id}:log`, -100, -1);
Nice, I like the simple design of streaming the log into Redis. Is this new, or is only the API endpoint new? I guess I'll find out as I go through this PR.
The only question that comes to mind is what the load might look like as the scale gets big. Redis is powerful, but I sense these logs could get verbose. Nevertheless, good stuff, and if there is a performance impact I'm sure it can be mitigated by filtering the logs or something.
Good question. `LRANGE` is O(S+N), where S is the start offset and N is the number of elements, so it will currently grow linearly. However, if we are interested in getting only a few of the most recent lines, we could switch from doing an `RPUSH` to an `LPUSH`. That way we wouldn't need any offset to retrieve the log. This would make both writes and reads constant-complexity operations, so there wouldn't be any issue with big logs, outside of storage space.
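To make the two access patterns concrete, here is a sketch using plain Python lists as a stand-in for the Redis commands (semantics only, not performance):

```python
# Pattern 1 (current): RPUSH appends, then LRANGE key -100 -1 slices the tail.
log: list[str] = []
for i in range(250):
    log.append(f"line {i}")          # RPUSH run:<id>:log "line i"
tail = log[-100:]                    # LRANGE run:<id>:log -100 -1

# Pattern 2 (proposed): LPUSH prepends, so the newest lines are always at
# the head and no offset math is needed.
log_newest_first: list[str] = []
for i in range(250):
    log_newest_first.insert(0, f"line {i}")   # LPUSH run:<id>:log "line i"
head = log_newest_first[:100]                 # LRANGE run:<id>:log 0 99
```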
@@ -197,7 +203,7 @@ class AlfalfaAPI {

    const { startDatetime, endDatetime, timescale, realtime, externalClock } = data;

    const job = `alfalfa_worker.jobs.${sim_type === "MODELICA" ? "modelica" : "openstudio"}.StepRun`;
    const job = `alfalfa_worker.jobs.${sim_type === "MODELICA" ? "modelica" : "openstudio"}.step_run.StepRun`;
`.step_run.StepRun` — I'm sure there is a motivation here. Why did you add the seemingly redundant job name?
oh I see, now that I'm further along in the review. This corresponds to your module structure.
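For context, dotted job paths like this are typically resolved along these lines (a generic `importlib` sketch, not the dispatcher's actual code):

```python
import importlib


def resolve_job(job_path: str):
    """Resolve a dotted path like
    'alfalfa_worker.jobs.openstudio.step_run.StepRun' to its class:
    everything before the last dot is the module, the last segment
    is the attribute looked up on it."""
    module_path, class_name = job_path.rsplit(".", 1)
    module = importlib.import_module(module_path)
    return getattr(module, class_name)
```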
@@ -34,99 +28,11 @@ def exec(self):
    """
    self.logger.info("add_fmu for {}".format(self.run.ref_id))

    # Create the FMU tags (no longer external now that python2 is deprecated)
    self.create_tags()
So the key change with all of these deleted lines of code, is that the FMU is supposed to come with the tags metadata baked into the FMU's resources directory. Is that correct? This seems to be aligned with the BOPTEST convention.
sim_year = self.start_datetime.year
self.sim_start_time = (self.start_datetime - datetime(sim_year, 1, 1)) / timedelta(seconds=1)
self.sim_end_time = (self.end_datetime - datetime(sim_year, 1, 1)) / timedelta(seconds=1)
sim_year = self.options.start_datetime.year
Just some refactoring to accommodate the `options` structure, correct?
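The computation itself, in isolation (the example datetimes are made up; the real ones come from the options):

```python
from datetime import datetime, timedelta

# Assumed example values standing in for self.options.start/end_datetime.
start_datetime = datetime(2024, 6, 1, 12, 0)
end_datetime = datetime(2024, 6, 2, 12, 0)

sim_year = start_datetime.year
year_start = datetime(sim_year, 1, 1)

# Simulation times are expressed as seconds since January 1 of the run year;
# dividing a timedelta by timedelta(seconds=1) yields a float second count.
sim_start_time = (start_datetime - year_start) / timedelta(seconds=1)
sim_end_time = (end_datetime - year_start) / timedelta(seconds=1)
```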
self.ep.outputs = outputs

def step(self):
def start_simulation_process(self):
And critically, this function blocks while EnergyPlus is running, so it is started in another `Process`. You might make a note about that.
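A minimal sketch of that pattern with `multiprocessing` (the blocking EnergyPlus call is replaced with a placeholder):

```python
from multiprocessing import Process


def start_simulation_process() -> None:
    # In the real job this drives pyenergyplus, which blocks until the
    # simulation finishes; a no-op placeholder stands in here.
    pass


def launch_simulation() -> Process:
    # Start the blocking run in a separate process so the caller
    # (e.g. the worker's step loop) is not blocked.
    process = Process(target=start_simulation_process)
    process.start()
    return process
```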
@@ -150,74 +148,6 @@ def checkout_run(self, run_id: str) -> Run:
    run.dir = run_path
    return run

def add_site_to_mongo(self, haystack_json: dict, run: Run):
Is there any data left being pushed into a Mongo document? I don't see it. Can we remove Mongo entirely?
@@ -0,0 +1,19 @@
{
I admit that I'm running out of puff, and I may not be catching all of the nuance in these tests, but I'm really glad they are here. :)
Finished
- `LOGLEVEL` environment variable determines which logs are seen via stdout.

In Progress
- `log_subprocess` function added to job which will automatically run a subprocess, pushing its output to the logger.

Todo
- `alfalfa_metadata` measure into CreateRun job using `openstudio` module.
- `StepRunBase` and `StepRunProcess`