This repository has been archived by the owner on Jan 6, 2023. It is now read-only.

Database Vendor Support: PostgreSQL #369

Closed
benhaynes opened this issue Sep 7, 2018 · 26 comments
Labels
enhancement New feature or request pinned

Comments

@benhaynes
Member

🚨 MIGRATED FROM REQUESTS.GETDIRECTUS.COM 🚨

Database Support - PostgreSQL DB

πŸ‘ = 14

Created 1 year ago by @ricricucit


@rijkvanzanten – (11 months ago)

It would be great to have some alternatives to just MySQL!

@benhaynes benhaynes added the enhancement New feature or request label Sep 7, 2018
@benhaynes benhaynes changed the title Database Support - PostgreSQL DB SQL Vendor Support: PostgreSQL Oct 23, 2018
@benhaynes benhaynes changed the title SQL Vendor Support: PostgreSQL Database Vendor Support: PostgreSQL Oct 23, 2018
@stale

stale bot commented Dec 22, 2018

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Dec 22, 2018
@benhaynes benhaynes removed the stale label Dec 26, 2018
@stale

stale bot commented Feb 24, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Feb 24, 2019
@benhaynes benhaynes added pinned and removed stale labels Feb 25, 2019
@benhaynes
Member Author

benhaynes commented Mar 18, 2019

From @puco - directus/directus#1020 (comment)

Guys, what is the status of this? It is hanging on the top as the most requested feature after all.


So to give an update here, this is indeed the (second) most requested feature (behind GraphQL support, which is almost ready for beta). The reality is: we're a small open-source team and have to triage tickets based on priority. As of now that means that we are focusing on:

  1. Stability (closing all App/API bugs)
  2. Documentation and Commenting (to help y'all out)
  3. Key component Refactors and optimizations (to keep things moving forward)
  4. Priority Feature Requests (this is where Postgres support exists)

Essentially, we're trying to limit the breadth of what we need to test/support until things are in a good place. Adding Postgres will be awesome, and we're moving in that direction by continuing to identify, document, and abstract any MySQL-specific queries/features. That way we'll be able to hit the ground running when the time is right to add this support.

It's looking like we'll be starting on this in Q3-2019, unless other priority issues arise (pushing it back) or a company "sponsors" its development (pulling it forward). In the meantime, if you want to help out, try contributing to our public doc, where we're consolidating info on what needs to happen to fully abstract our database layer, become SQL-vendor agnostic, and support Postgres:

https://docs.google.com/document/d/17ZUuUxb2qMotYduTzLuntvdj45ogfvU-d_RsDMMEhOA/edit?usp=sharing

@benhaynes
Member Author

To achieve better clarity/visibility, we are now tracking feature requests within the Feature Request project board.

This issue being closed does not mean it's not being considered.

@matleh

matleh commented Aug 20, 2019

What's the current state of PostgreSQL support in Directus? The release board for v2.2.0 says it's DONE, but the code seems only to have been merged to the pgsql branch.

Is this actually done, and if so, how can I use it? I can't find any documentation (https://docs.directus.io/guides/cli.html#help, for example, still says only MySQL is supported).

@benhaynes
Member Author

I thought our PostgreSQL PR was merged in. @directus/api-team ?

@rijkvanzanten
Member

We're collecting community feedback on the postgres implementation before releasing it to the public

@hemratna
Contributor

@benhaynes

  1. We will test https://github.com/directus/api/tree/pgsql and, if everything looks fine, merge it into the develop branch.
  2. Since we will be officially supporting PostgreSQL, I suggest we have a hosted demo, just like the Directus App.
     This would let us quickly check any upcoming issues that affect only PostgreSQL.
  3. We need to prepare a demo SQL file for PostgreSQL, similar to the MySQL one. (I'm thinking of generating it from the current MySQL file, so it stays easy to maintain.)

Until we have automated testing in place, we will have to check PostgreSQL's stability manually from time to time (mostly when we ship any DB-related fixes).
@shealavington, please make sure the test cases you are writing also work for PostgreSQL, so that in the future we have automated testing for PostgreSQL too.

@urvashithakar @BJGajjar
Please plan to test PostgreSQL on your local machines with all the test cases mentioned in #1099.

@matleh

matleh commented Aug 22, 2019

Are there any plans for when PostgreSQL support will be available to the general public? Any estimated time frame?

@rijkvanzanten
Member

@matleh like @BJGajjar mentioned above, it's going to be very hard to know if it's stable enough to be "production ready", as it's going to rely on manually testing some use cases. We might merge it into master but not "officially support"* it for a little while.

* Officially supporting something basically means pointing people to it in the docs. We'll add a page in there saying that postgres support is experimental

@shealavington

shealavington commented Sep 2, 2019

I've just seen your comment @hemratna

Please make sure the test cases you are writing also work for PostgreSQL, so that in the future we have automated testing for PostgreSQL too.

As far as I'm aware, my test suite will be a simple Input -> Output test set. These tests won't depend on which database is used; they will check that the API behaves as expected based on the documented endpoints, and you shouldn't have to change which requests you make based on which database is used.

Ultimately, as long as the API works as expected, tests should pass, and I shouldn't have to write any separate tests for different databases.

@rijkvanzanten
Member

You're right, those tests should work regardless of database / tech stack used in the API

@benhaynes
Member Author

Maybe the tests could add a test table to the database with known data before running, and remove it afterwards? That way it works on any database, but still has the info it needs to perform the tests?

@rijkvanzanten
Member

It definitely has to do that, as we should assume an empty installation of Directus for these tests (ping @shealavington).

@shealavington

shealavington commented Sep 3, 2019

Maybe the tests could add a test table to the database with known data before running

I could write the test setup and tear-down specifically for MySQL, using a Node MySQL client to do the following; however, that would then require anyone running the tests to be using MySQL specifically:

  • Before Test: Create table schema
  • Test: Ensure items can be added through API
  • After Test: Drop table

Either that, or I could rely completely on the API to do it all; then if the before or after step fails, the test fails because it was incorrectly configured, but this would let us run the suite against any instance (see the sketch after this list):

  • Before Test: Create collection
  • Test: Ensure items can be added through API
  • After Test: Remove collection
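
Something like this is what I have in mind for the API-only option. It's only a rough sketch, not the final design: it assumes a Jest-style runner, a runtime with a global fetch, and made-up endpoint paths, token handling, and a throwaway test_articles collection.

```ts
// Rough sketch: API-based setup/teardown, so the suite never touches the DB directly.
// The base URL, token handling, and endpoint paths are assumptions for illustration.
const BASE = process.env.API_URL ?? "http://localhost:8080/_";
const TOKEN = process.env.ACCESS_TOKEN ?? "test-token";

async function api(method: string, path: string, body?: unknown) {
  const res = await fetch(`${BASE}${path}`, {
    method,
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${TOKEN}` },
    body: body ? JSON.stringify(body) : undefined,
  });
  if (!res.ok) throw new Error(`${method} ${path} failed with ${res.status}`);
  return res.status === 204 ? null : res.json();
}

describe("items endpoint", () => {
  // Before the suite: create a throwaway collection through the API (no vendor-specific SQL).
  beforeAll(async () => {
    await api("POST", "/collections", { collection: "test_articles" });
  });

  it("can create an item", async () => {
    const created = await api("POST", "/items/test_articles", { title: "hello" });
    expect(created).toBeTruthy();
  });

  // After the suite: remove the collection again so the instance is left clean.
  afterAll(async () => {
    await api("DELETE", "/collections/test_articles");
  });
});
```

If the beforeAll or afterAll step fails here, we'd know the API itself is broken, which is kind of the point.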

But yeah, all in all, it has nothing to do with whether you're using MySQL, MariaDB, or Postgres; we're simply checking that the API is performing as we expect it to.

@benhaynes
Member Author

That makes sense. We'll definitely want to make sure this is database-vendor agnostic. One more option to add into the mix:

If we want to ensure the setup/teardown works every time (doesn't rely on the API), then we could also have specific commands for each vendor. Since this "dummy" data shouldn't really change often, that shouldn't be a crazy amount of work.

Question 1: Can we keep the dummy data as one pure SQL file? Or does it need to be individual commands to work properly?

Question 2: Would we have setup/teardown before/after each individual test, or before/after all tests?
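
To make the vendor-specific option concrete, I'm picturing something as small as a map from vendor to seed/teardown command. This is purely a hypothetical sketch; the seed file names, database names, and connection flags are placeholders.

```ts
// Hypothetical sketch of vendor-specific seed/teardown commands.
// Seed file names, database names, and connection flags are placeholders.
import { execSync } from "child_process";

type Vendor = "mysql" | "pgsql";

const seed: Record<Vendor, { setup: string; teardown: string }> = {
  mysql: {
    setup: "mysql -u root directus_test < tests/seed.mysql.sql",
    teardown: "mysql -u root directus_test < tests/teardown.mysql.sql",
  },
  pgsql: {
    setup: "psql -U postgres -d directus_test -f tests/seed.pgsql.sql",
    teardown: "psql -U postgres -d directus_test -f tests/teardown.pgsql.sql",
  },
};

const vendor = (process.env.DB_VENDOR ?? "mysql") as Vendor;

// Question 2 in code form: these hooks could run once per test run (beforeAll/afterAll),
// or around every individual test (beforeEach/afterEach).
beforeAll(() => execSync(seed[vendor].setup, { stdio: "inherit" }));
afterAll(() => execSync(seed[vendor].teardown, { stdio: "inherit" }));
```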

@shealavington

shealavington commented Sep 4, 2019

I really wanted to K.I.S.S., and adding vendor-specific commands creates more complexity. Ideally we could use the API for the entire thing; then if you make a change and run the tests and they fail anywhere, there's likely an issue with the code, even if the failure was in the setup (before) or teardown (after).

Q1: I believe we could have one SQL file that we use every time: run it before all tests run, and drop the data after all tests have run. Though we'd need to ensure that when someone adds a new test, the seed data that test needs is included, and the file would have to work for each vendor, which means someone has to verify the tests against every vendor.

Q2: The original thought was before/after each suite. For example, to test that the API can delete items, I'd make a suite of tests that deletes 1 item and then 4 items, but before they run I'd need to insert 5 items for the tests to use.

Maybe we just write all tests in a format that allows them to take care of themselves?
So I would create a suite of tests called Items, and that suite would (rough sketch after this list):

  • Create the items it needs - if it fails then the test fails here
  • Read the items it needs - if it fails then the test fails here
  • Update the items it needs - if it fails then the test fails here
  • Delete the items it needs - if it fails then the test fails here
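
A rough sketch of that self-contained style could look like the following (again Jest-style, with assumed endpoint paths and a made-up test_articles collection):

```ts
// Hypothetical sketch of a self-contained "Items" suite: each step exercises the API
// and leaves things ready for the next step, so no external seed data is needed.
const BASE = process.env.API_URL ?? "http://localhost:8080/_";

async function request(method: string, path: string, body?: unknown) {
  const res = await fetch(`${BASE}${path}`, {
    method,
    headers: { "Content-Type": "application/json" },
    body: body ? JSON.stringify(body) : undefined,
  });
  if (!res.ok) throw new Error(`${method} ${path} -> ${res.status}`); // the suite fails here
  return res.status === 204 ? null : res.json();
}

describe("Items", () => {
  let id: number;

  it("creates the items it needs", async () => {
    const { data } = await request("POST", "/items/test_articles", { title: "Lorem ipsum" });
    id = data.id;
  });

  it("reads the items it created", async () => {
    const { data } = await request("GET", `/items/test_articles/${id}`);
    expect(data.title).toBe("Lorem ipsum");
  });

  it("updates the items it created", async () => {
    await request("PATCH", `/items/test_articles/${id}`, { title: "Lorem ipsum (edited)" });
  });

  it("deletes the items it created", async () => {
    await request("DELETE", `/items/test_articles/${id}`);
  });
});
```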

@benhaynes
Member Author

Originally I was thinking we could use the Demo SQL file for tests, but I think we should be able to change that whenever we want, and shouldn't worry about breaking the tests.

Ooooh, I like the CRUD tests... testing themselves. If we use the API to create collections/fields/relationships, create items, then read them, update them, revert them, check revisions, delete them, etc., then we can just catch and report whenever/wherever things break. Good thinking Shea! Then we don't even need a "seed schema".

Maybe the dummy data could even be language agnostic. So a lot of numbers, Lorem ipsum, and/or using "test names" as the data itself.

Also I just realized all this useful info is on the PostgreSQL issue... whoops!

@puco

puco commented Sep 6, 2019

I tried to test the code as well and can't seem to connect. Can someone post what the config file for postgres should look like? Or how to turn on better logging? The only thing I currently see in the logs is the rather unspecific api[_].ERROR: Directus\Database\Exception\ConnectionFailedException: Failed to connect to the database in /var/www/src/core/Directus/Application/CoreServicesProvider.php:776 (running with docker compose built from the pgsql branch).

@puco

puco commented Sep 7, 2019

Re connection: found the issue. I needed to install pdo_pgsql (the error reporting was not very helpful, as it does not log the original exception thrown in the Zend adapter). After that I managed to connect. I then needed to refactor schema.sql for pgsql; after adjusting the data types it starts and I could log in. But I can't create any new collections (it fails on creating the id column, which I guess is created as int(11), which pgsql does not know what to do with). I also can't manage my existing tables (I can add the table but can't manage any fields due to some app issue).

@benhaynes
Member Author

Hey @puco - I assume you're testing with the pgsql branch?

@puco

puco commented Sep 10, 2019

@benhaynes yes I am. I can provide more feedback as I test along, if you are interested. But I don't think this is the right place for it.

@benhaynes
Member Author

Thank you! This is probably the best place for it, for now. Any feedback is very much appreciated!

@Toilal

Toilal commented Sep 25, 2019

The pgsql branch is now 198 commits behind the develop branch. Can I open a new pull request with this branch rebased on develop, so it can be merged into develop?

I'll give this branch a try because I really need PostgreSQL for one of my projects.

@benhaynes
Member Author

Yeah, let's get this up-to-date and tested... then we can finally look into getting it merged. The problem in the past has been the added time investment and complexity of maintaining a new database vendor, so the more help we get from the community, the better.

@Toilal

Toilal commented Sep 26, 2019

PR #1319 is open; I'll try to test on top of this branch.
