[ISSUE 2716] Create user tables for basic login.gov #2760
base: main
Conversation
…ults (#2730)

## Summary

Fixes #2729

### Time to review: __3 mins__

## Changes proposed

Set `track_total_hits` to True when calling OpenSearch.

## Context for reviewers

While the docs note that this field has a possible performance cost, since it requires counting all matching records, we already request counts for various facets anyway, so I expect this won't matter in practice.

## Additional information

https://opensearch.org/docs/latest/api-reference/search/

I loaded ~16k records into my local search index. Querying it with no filters now returns this pagination info:

```json
{
  "order_by": "opportunity_id",
  "page_offset": 1,
  "page_size": 25,
  "sort_direction": "ascending",
  "total_pages": 676,
  "total_records": 16884
}
```
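For illustration, a minimal sketch of the change, assuming the opensearch-py client; the host, index name, and query are placeholders, not the project's actual code:

```python
# Minimal sketch, assuming opensearch-py; host, index, and query are placeholders.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

response = client.search(
    index="opportunities",  # hypothetical index name
    body={
        "query": {"match_all": {}},
        # Without this, OpenSearch caps the reported hit count at 10,000,
        # which would make total_records / total_pages inaccurate.
        "track_total_hits": True,
    },
)
total_records = response["hits"]["total"]["value"]  # exact count, e.g. 16884
```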
## Context

This is currently failing a lot of CI builds.
## Summary

Fixes #2665

### Time to review: __1 min__

## Changes proposed

> What was added, updated, or removed in this PR.

Added the `gh-transform-and-load` command to the existing `make gh-data-export` command. I'm not sure if this is sufficient or correct, but I'm taking a guess based on what I see in #2546 and #2506.

## Context for reviewers

> Testing instructions, background context, more in-depth details of the implementation, and anything else you'd like to call out or ask reviewers. Explain how the changes were verified.

In the analytics work stream, we have a new CLI command, `make gh-transform-and-load`, for transforming and loading (some) GitHub data. Per issue #2665, that command should run daily, after the existing `gh-data-export` command, which exports data from GitHub. `scheduled_jobs.tf` appears to be the mechanism by which `make gh-data-export` runs daily, so in this PR I'm taking an educated guess and adding `gh-transform-and-load` to the existing job, and requesting feedback from @coilysiren on whether this is the correct approach.

## Additional information

> Screenshots, GIF demos, code examples or output to help show the changes working as expected.

Co-authored-by: kai [they] <[email protected]>
## Summary

Fixes #2665

### Time to review: __1 min__

## Changes proposed

> What was added, updated, or removed in this PR.

Added a scheduled job to run `make init-db`.

## Context for reviewers

> Testing instructions, background context, more in-depth details of the implementation, and anything else you'd like to call out or ask reviewers. Explain how the changes were verified.

The GitHub data export, transform, and load job (see #2759) depends on a certain schema existing in Postgres. This PR creates a job to ensure that schema exists.

## Additional information

> Screenshots, GIF demos, code examples or output to help show the changes working as expected.
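As a rough sketch of the kind of schema bootstrapping an `init-db` step implies, assuming SQLAlchemy; the connection URL and schema name here are hypothetical, not the project's actual values:

```python
# Hypothetical sketch of schema bootstrapping, assuming SQLAlchemy;
# the connection URL and schema name are placeholders.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://localhost/analytics")  # placeholder URL

with engine.begin() as conn:
    # Idempotent: safe to run on every scheduled invocation.
    conn.execute(text("CREATE SCHEMA IF NOT EXISTS github_data"))  # placeholder name
```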
### Time to review: __1 min__

## Context for reviewers

Platform's assertion is this: whenever a deploy fails for any reason, the deploy is canceled, which locks the other 3 jobs. Those 3 jobs remain locked indefinitely. On the next deploy, the 3 previously locked jobs fail because they are still locked, which causes the 1 remaining job to be canceled, and so all 4 jobs end up locked again. It's an avalanche effect: once 1 deploy fails, every deploy from that point onward fails.
first_name: Mapped[str]
last_name: Mapped[str]
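For context, a hypothetical sketch of how these columns might sit in a SQLAlchemy 2.0 declarative model; the class name, table name, and primary key are assumptions, not the PR's actual code:

```python
# Hypothetical model sketch; only first_name / last_name appear in the diff above.
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class User(Base):  # assumed class/table name
    __tablename__ = "user"

    user_id: Mapped[int] = mapped_column(primary_key=True)  # assumed key
    first_name: Mapped[str]
    last_name: Mapped[str]
```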
NOTE: regardless of anything else, we should hold off on merging this until I've gotten more clarification. We might not have first/last name, as those require ID proofing, and I don't know if we intend for users to be ID proofed 100% of the time.
Converted this PR to a Draft PR so we don't accidentally merge it.
…er-grants-gov into 2716/basic-user-tables
## Summary

Fixes #2716

### Time to review: __10 mins__

## Changes proposed

- 3 user tables
- migration script
- updated factories to create new users (see the sketch below)
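As a rough illustration of the "updated factories" item, a hedged factory_boy sketch; the model, session, and import paths are hypothetical, not the PR's actual code:

```python
# Hypothetical factory sketch using factory_boy; imports are placeholders.
import factory
from factory.alchemy import SQLAlchemyModelFactory

from app.db import Session  # hypothetical scoped session
from app.models import User  # hypothetical model, as sketched earlier


class UserFactory(SQLAlchemyModelFactory):
    class Meta:
        model = User
        sqlalchemy_session = Session

    first_name = factory.Faker("first_name")
    last_name = factory.Faker("last_name")
```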
## Context for reviewers

The user tables will be used for OAuth2.

## Additional information