Chat w/ Documents Plugin Development Updates

Just ran some tests, and it works for everything I can determine is needed.

feature/plugin-required-services-runtime

I am not merging it at this moment. If you feel good with it I can merge, or if you would rather just grab the code and add it to your work, that is fine as well.

hey @DJJones ,

no need to send the alembic migration scripts, I added them already

and

Auto start and stop of services has also been added to app/main.py

What is our policy for contributions, @davewaring, @DJJones?

Send a pull request to main with an assigned reviewer, or can I merge it myself?

I’ll defer to whatever @DJJones prefers in this.

We are at a stage right now where there is no reason to slow others down, so just follow solid practices and make sure the PR has any recent merges from main. For example, yesterday I did 2 merges in addition to the migration, which I left hanging for you to determine what worked for you.

I did answer the email about this, but wanted to make sure it was noticed when I saw it here as well.


Thanks @DJJones and @beck, appreciate you guys communicating here on the forum so everyone can be kept up to date.

Thanks,

Dave W.

there are some issues with your latest updates on main @DJJones .

It doesn’t load the dashboard; I get: Error loading route; Route not found: Dashboard

And about the migration script: I found only one new script, committed on August 14th (it doesn’t seem to be related to the plugin service runtime data)

created a pull request @DJJones
pull request #43

and here is the Google Drive link with the BrainDrive logs and database:
https://drive.google.com/drive/folders/1NFUEpoJ61ghWMsTfIAeZgMa31tN8e8mN?usp=sharing

This is a heads up. I have yet to find the pattern, but I have nailed down the error that is being triggered.

2025-08-21T19:57:08.456319Z [info     ] Stopping all plugin services... [app.plugins.service_installler.start_stop_plugin_services]
2025-08-21T19:57:08.463324Z [info     ] Request received               [main] method=OPTIONS path=/api/v1/auth/login
2025-08-21T19:57:08.463324Z [info     ] Response sent                  [main] path=/api/v1/auth/login status_code=200
INFO:     127.0.0.1:52583 - "OPTIONS /api/v1/auth/login HTTP/1.1" 200 OK
2025-08-21T19:57:08.464324Z [info     ] No plugin services found in the database to stop. [app.plugins.service_installler.start_stop_plugin_services]
2025-08-21T19:57:08.465324Z [info     ] ✅ Database connection closed   [main]
ERROR:    Traceback (most recent call last):
  File "C:\Users\david\miniconda3\envs\BrainDriveBek\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\david\miniconda3\envs\BrainDriveBek\Lib\asyncio\base_events.py", line 654, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
asyncio.exceptions.CancelledError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\david\miniconda3\envs\BrainDriveBek\Lib\asyncio\runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "C:\Users\david\miniconda3\envs\BrainDriveBek\Lib\asyncio\runners.py", line 123, in run
    raise KeyboardInterrupt()
KeyboardInterrupt

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\david\miniconda3\envs\BrainDriveBek\Lib\site-packages\starlette\routing.py", line 700, in lifespan
    await receive()
  File "C:\Users\david\miniconda3\envs\BrainDriveBek\Lib\site-packages\uvicorn\lifespan\on.py", line 137, in receive
    return await self.receive_queue.get()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\david\miniconda3\envs\BrainDriveBek\Lib\asyncio\queues.py", line 158, in get
    await getter
asyncio.exceptions.CancelledError

2025-08-21T19:57:08.471957Z [info     ] Request received               [main] method=POST path=/api/v1/auth/login
Exception in thread Thread-5:
Traceback (most recent call last):
  File "C:\Users\david\miniconda3\envs\BrainDriveBek\Lib\site-packages\aiosqlite\core.py", line 107, in run
2025-08-21 15:57:10 [info     ] File watcher is temporarily disabled


I was able to get the error in a terminal, but not in VS Code (yet).

I am assuming the issue exists even without the branch, because this has never triggered before. I am also assuming it is in the service startup, because you didn't experience it beforehand.

What I am not experiencing is the loading and sidebar menu issues, but the point where it stops for you is the same point above. Like I said, I am making a few assumptions at this point. I am going to continue forward on this.

Here is the recording from today’s Chat w/ Documents check-in call. We spent a chunk of the call discussing the issue referenced earlier in this thread. I’ve included an AI-powered summary of the rest of the call below the video.

Questions, comments, concerns, and ideas welcome as always. Just hit the reply button.

Thanks
Dave W.

Recording:

Summary:

Chat With Documents: Current Status & Next Steps

  1. Chat With Documents (CWD) Docker installation is working – It auto-starts/stops with backend server; ready for main merge.
  2. Waiting on David’s final verification – Ensuring nothing critical is broken before merging.
  3. Integration to Brain Drive models needed next – Instead of manually setting LLMs, it should use the Brain Drive default model.
  4. Chat history duplication issue – Chat history currently saved in both CWD and Brain Drive; should consolidate under Brain Drive only.
  5. spaCy layout used for doc extraction – Lives in a separate backend; CWD calls that API when needed.
  6. Plan to bundle installation process – End user installs plugin and both backends (CWD + doc processor) are started via script.

Settings Plugin Discussion

  1. Plugin needs a settings module – For users to update variables (embedding model, doc processor location) without terminal.
  2. Examples already exist – Other settings plugins (Ollama, theme) can be used as reference.
  3. Settings handled via JSON config in DB – No need to write new backend code; just UI to display/update settings.
  4. Plan to update documentation – Docs need to reflect settings plugin capabilities and remote-first architecture.
  5. David confirms plugin settings architecture is solid – Each plugin should eventually have its own settings module.
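
The "JSON config in DB" point can be sketched roughly like this. The table and column names below are assumptions for illustration only (not BrainDrive's actual schema); the key names match the variables discussed above (embedding model, doc processor location):

```python
import json
import sqlite3

# Hypothetical settings row: a JSON blob keyed by definition + user.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE settings_instances (definition_id TEXT, user_id TEXT, value TEXT)"
)
conn.execute(
    "INSERT INTO settings_instances VALUES (?, ?, ?)",
    ("cwd_settings", "u1", json.dumps({"embedding_model": "nomic-embed-text"})),
)

# A settings UI then only needs to read, patch, and write the JSON back;
# no new backend code is required beyond exposing this round trip.
row = conn.execute(
    "SELECT value FROM settings_instances WHERE definition_id=? AND user_id=?",
    ("cwd_settings", "u1"),
).fetchone()
cwd_settings = json.loads(row[0])
cwd_settings["doc_processor_url"] = "http://localhost:9000"
conn.execute(
    "UPDATE settings_instances SET value=? WHERE definition_id=? AND user_id=?",
    (json.dumps(cwd_settings), "cwd_settings", "u1"),
)
```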

Embedding Model Configuration

  1. Currently configured via environment variable – User specifies Ollama host via env file.
  2. Move toward in-app config – Users will eventually choose/change embedding model via UI settings.
  3. Defaults can be provided – Default doc extractor (spaCy) and embedding model can be set for user convenience.

Roadmap Summary

  1. Steps outlined:
  • Resolve main branch issue.
  • Integrate Brain Drive model support.
  • Merge chat history storage.
  • Build plugin settings module for configuration.
  2. Settings module to be reusable across plugins – Long-term infrastructure improvement.

Alembic Migration Issues

  1. Beck encountered Alembic head conflicts – Multiple heads found, had to merge manually.
  2. Still seeing dashboard not found error – Even after successful DB migration.
  3. David suspects old user data – Asks Beck to create a new user to test fresh install.
  4. New user creation fails – Confirms something deeper may be wrong.
  5. David to take over merging – Beck will submit PR; David will handle merge and further debugging.

Deployment Practices

  1. Biasing toward speed, not security (for now) – Since it’s internal use, quick iterations are prioritized.
  2. Logs and DB to be shared – Beck will provide logs and .db file to help with diagnosis.
  3. Confirmed OS: Windows 11 – May or may not be relevant.
  4. Issue may relate to DB migration not completing properly – Possibly during initial install.

Next Meeting Coordination

  1. Next check-in planned for Monday – Beck will be invited to verify if issues are resolved.
  2. Testing targeted for tomorrow – Pending David’s confirmation after debugging.

Chat With Documents: Broader Implications

  1. Plugin paves way for more integrations – With Docker and plugin settings, other services can now be added easily.
  2. Plugin pattern unlocks extensibility – Non-developers can configure without touching terminal.
  3. Same approach works for other plugins (e.g., series) – Unified plugin-setting model to be reused.

I found a few issues that stuck out really fast, so I am going to present them here to @beck, since he can trigger the problem predictably.

The functions you were calling could be causing a race condition, which would explain a few things, so I changed the calls to use the wrapper functions. I have not been able to trigger the issues since.

Short Version

Fix 1: Correct the Import Path

# Change line 8 from:
from backend.app.plugins.service_installler.start_stop_plugin_services import start_plugin_services, stop_plugin_services

# To:
from app.plugins.service_installler.start_stop_plugin_services import start_plugin_services_on_startup, stop_all_plugin_services_on_shutdown


Fix 2: Update Function Calls

# In startup_event() line 34:
await start_plugin_services_on_startup()

# In shutdown_event() line 44:
await stop_all_plugin_services_on_shutdown()
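
For context on why the wrappers can matter for the suspected race: wrapper functions like these are a natural place to make startup idempotent, so a second invocation (e.g. from a reloader firing the lifespan hooks again) cannot double-start services. A minimal sketch of that idea, with all names hypothetical (this is not the actual BrainDrive implementation):

```python
import asyncio

started_services: list[str] = []

async def _do_start_all_services() -> None:
    # Stand-in for the real per-plugin service starter.
    started_services.append("chat-with-documents")

_lock = asyncio.Lock()
_started = False

async def start_services_once() -> None:
    """Idempotent wrapper: repeated or concurrent calls start services once."""
    global _started
    async with _lock:
        if _started:
            return
        _started = True
        await _do_start_all_services()

async def main() -> None:
    # Simulate the startup hook firing several times concurrently.
    await asyncio.gather(*(start_services_once() for _ in range(5)))

asyncio.run(main())
print(started_services)  # the service list contains exactly one entry
```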

Long Version

from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import JSONResponse
from fastapi.exceptions import RequestValidationError
from app.api.v1.api import api_router
from app.core.config import settings
from app.routers.plugins import plugin_manager
from app.plugins.service_installler.start_stop_plugin_services import start_plugin_services_on_startup, stop_all_plugin_services_on_shutdown
import logging
import time
import structlog

app = FastAPI(title=settings.APP_NAME)

# Configure CORS using settings from environment
app.add_middleware(
    CORSMiddleware,
    allow_origins=settings.cors_origins_list,
    allow_credentials=settings.CORS_ALLOW_CREDENTIALS,
    allow_methods=settings.cors_methods_list,
    allow_headers=settings.cors_headers_list,
    expose_headers=settings.cors_expose_headers_list,
    max_age=settings.CORS_MAX_AGE,
)

# Add startup event to initialize settings
@app.on_event("startup")
async def startup_event():
    """Initialize required settings on application startup."""
    logger.info("Initializing application settings...")
    from app.init_settings import init_ollama_settings
    await init_ollama_settings()
    # Start plugin services
    await start_plugin_services_on_startup()
    logger.info("Settings initialization completed")


# Add shutdown event to gracefully stop services
@app.on_event("shutdown")
async def shutdown_event():
    """Gracefully stop all plugin services on application shutdown."""
    logger.info("Shutting down application and stopping plugin services...")
    # Stop all plugin services gracefully
    await stop_all_plugin_services_on_shutdown()
    logger.info("Application shutdown completed.")


# Add middleware to log all requests
logger = structlog.get_logger()

@app.middleware("http")
async def log_requests(request: Request, call_next):
    """Log all incoming requests."""
    start_time = time.time()
    
    # Log the request with full details
    logger.info(
        "Request received",
        method=request.method,
        url=str(request.url),
        path=request.url.path,
        query_params=str(request.query_params),
        client=request.client.host if request.client else None,
        headers=dict(request.headers),
    )
    
    try:
        # Process the request
        response = await call_next(request)
        
        # Calculate processing time
        process_time = time.time() - start_time
        
        # Log the response with full details
        logger.info(
            "Request completed",
            method=request.method,
            url=str(request.url),
            path=request.url.path,
            status_code=response.status_code,
            process_time_ms=round(process_time * 1000, 2),
        )
        
        return response
    except Exception as e:
        # Log any exceptions
        logger.error(
            "Request failed",
            method=request.method,
            url=str(request.url),
            path=request.url.path,
            error=str(e),
            exception_type=type(e).__name__,
        )
        raise

# Add exception handler for validation errors
@app.exception_handler(RequestValidationError)
async def validation_exception_handler(request: Request, exc: RequestValidationError):
    """Handle validation errors."""
    import logging
    logger = logging.getLogger(__name__)
    logger.error(f"Validation error: {exc.errors()}")
    return JSONResponse(
        status_code=422,
        content={"detail": exc.errors()},
    )

# Include API routers
app.include_router(api_router)

@beck
I thought of a new way to test this, and my solution above didn’t solve your issue. If anything, it revealed a bit more info. I need to get this issue to appear in VS Code so I can breakpoint it.

Thanks for the corrected import path and wrapper functions.

But the functions are being called in backend/main.py (not backend/app/main.py).
I was also confused at first, and initially tried adding the runtime service auto-starter in backend/app/main.py.

Here is where the start/stop of the runtime service is actually called:

I am also debugging the “Error loading navigation” issue. I switched back to the commit before updating from main (git checkout d1420dd) and the issue is gone, confirming that this error was introduced somewhere in the update from main. Will update this thread once I find the cause.

@beck Yeah, I did something similar. Since the race condition seems to be in registration, I went back to main to verify it was not there. (I did a billion registrations while I was hardening the CORS, so I wanted to make sure this wasn’t present in that PR.)

The one thing I have yet to see is your navigation issue. You may want to look into the db and make sure everything is there that should be, since this could simply be an effect of the race condition in registration causing not everything to be handled correctly.

@beck

I am not finished testing yet but did find a solution:

uvicorn main:app --host localhost --port 8005

Removing the file watcher took out the race conditions. This works for me in a terminal for testing, and I will move into VS Code to continue testing. Figured another set of eyes is always good.

In the .env, just set reload to false; so far, in my limited testing, this has worked:

# Server Settings
HOST="0.0.0.0"
PORT=8005
RELOAD=false
LOG_LEVEL="info"
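
One thing to watch with `RELOAD=false`: if the config loader reads the variable naively, any non-empty string (including "false") is truthy in Python. A hedged sketch of an explicit parse (a hypothetical helper, not BrainDrive's actual settings loader; pydantic-style settings classes handle this conversion for you):

```python
import os

def env_bool(name: str, default: bool = False) -> bool:
    """Parse a boolean env var explicitly; note bool("false") would be True."""
    val = os.environ.get(name)
    if val is None:
        return default
    # Strip optional quotes (the .env above quotes its string values).
    return val.strip().strip('"').lower() in ("1", "true", "yes", "on")

os.environ["RELOAD"] = "false"
print(env_bool("RELOAD"))  # False
```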

Hi @DJJones ,

I think I’ve found the issue, and I want to share the steps I took to isolate an Alembic migration problem and provide context, so we can clarify where the problem is coming from.

In short: the issue is with 219da9748f46_add_hierarchical_navigation_support.py migration, which causes a CircularDependencyError.


Background

I was working on my branch feature/plugin-services-runtime, which includes my plugin service runtime code and two migrations:

  • 4f726504a718_add_user_id_and_plugin_slug_to_plugin_.py
  • 64046e143e97_add_required_services_runtime_to_plugin_.py

I merged updates from main before creating a pull request, which brought in a new migration script:

  • 219da9748f46_add_hierarchical_navigation_support.py (created on 2025-08-14)

Then I created a new merge migration:

Revision ID: c3edaf85d73f
Revises: 219da9748f46, 4f726504a718
Create Date: 2025-08-21 17:38:06.586550
Path: backend/migrations/versions/c3edaf85d73f_merge_heads_for_hierarchical_nav_plugin_.py
Message: merge heads for hierarchical nav + plugin service runtime

After merging both (updates from git main and merge migration script), my app started failing, and I couldn’t access the dashboard.


Steps I took to isolate the issue

  1. Checked Alembic current state:

    alembic current
    

    Result: c3edaf85d73f (mergepoint) - as expected.

  2. Stamped Alembic back to a previous safe revision (before my migration scripts):

    alembic stamp cb95bbe8b720
    
    • This removed both my plugin service migrations and the hierarchical navigation migration from Alembic’s tracking.
    • Verified Alembic current: cb95bbe8b720.
  3. Switched Git to detached HEAD (before merging main):

    git checkout d1420dd
    
    • At this point, my plugin service code was intact.
    • The app worked correctly - all plugin services were functioning.
  4. Created a new local branch from d1420dd.

    • Deleted my two migration scripts.
    • Verified Alembic was still on cb95bbe8b720 (branchpoint).
    • Merged updates from main (which introduced 219da9748f46_add_hierarchical_navigation_support.py).
  5. Tested Alembic upgrade:

    alembic upgrade head
    

    Result:

    sqlalchemy.exc.CircularDependencyError: Circular dependency detected. 
    ('display_order', 'parent_id', 'is_expanded', 'is_collapsible')
    
    • This happens even on a clean SQLite database with just this migration applied.

Error Log (excerpt)

(BrainDriveDev) PS C:\Users\beckb\Documents\GitHub\brain_drive\BrainDrive\backend> alembic upgrade head
INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade cb95bbe8b720 -> 219da9748f46, add_hierarchical_navigation_support
Phase 1: Adding hierarchical columns to navigation_routes...
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Scripts\alembic.exe\__main__.py", line 7, in <module>
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\config.py", line 636, in main
    CommandLine(prog=prog).main(argv=argv)
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\config.py", line 626, in main
    self.run_cmd(cfg, options)
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\config.py", line 603, in run_cmd
    fn(
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\command.py", line 406, in upgrade
    script.run_env()
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\script\base.py", line 586, in run_env
    util.load_python_file(self.dir, "env.py")
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\util\pyfiles.py", line 95, in load_python_file
    module = load_module_py(module_id, path)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\util\pyfiles.py", line 113, in load_module_py
    spec.loader.exec_module(module)  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\beckb\Documents\GitHub\brain_drive\BrainDrive\backend\migrations\env.py", line 176, in <module>
    run_migrations_online()
  File "C:\Users\beckb\Documents\GitHub\brain_drive\BrainDrive\backend\migrations\env.py", line 151, in run_migrations_online
    context.run_migrations()
  File "<string>", line 8, in run_migrations
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\runtime\environment.py", line 946, in run_migrations
    self.get_context().run_migrations(**kw)
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\runtime\migration.py", line 623, in run_migrations
    step.migration_fn(**kw)
  File "C:\Users\beckb\Documents\GitHub\brain_drive\BrainDrive\backend\migrations\versions\219da9748f46_add_hierarchical_navigation_support.py", line 28, in upgrade
    with op.batch_alter_table('navigation_routes', schema=None) as batch_op:
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\contextlib.py", line 144, in __exit__
    next(self.gen)
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\operations\base.py", line 398, in batch_alter_table
    impl.flush()
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\operations\batch.py", line 164, in flush
    batch_impl._create(self.impl)
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\operations\batch.py", line 444, in _create
    self._transfer_elements_to_new_table()
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\operations\batch.py", line 330, in _transfer_elements_to_new_table
    self._adjust_self_columns_for_partial_reordering()
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\alembic\operations\batch.py", line 315, in _adjust_self_columns_for_partial_reordering
    sorted_ = list(
              ^^^^^
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\sqlalchemy\util\topological.py", line 73, in sort
    for set_ in sort_as_subsets(tuples, allitems):
  File "C:\Users\beckb\.conda\envs\BrainDriveDev\Lib\site-packages\sqlalchemy\util\topological.py", line 47, in sort_as_subsets
    raise CircularDependencyError(
sqlalchemy.exc.CircularDependencyError: Circular dependency detected. ('display_order', 'parent_id', 'is_expanded', 'is_collapsible')
(BrainDriveDev) PS C:\Users\beckb\Documents\GitHub\brain_drive\BrainDrive\backend>

Conclusion

The CircularDependencyError comes directly from the way
219da9748f46_add_hierarchical_navigation_support.py
tries to add multiple interdependent columns in a single batch_op on SQLite.
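
For anyone curious about the mechanism: the traceback ends in SQLAlchemy's topological sort of column-ordering constraints during the batch-table rebuild, and a cycle among those constraints makes any ordering impossible. The failure mode can be illustrated with the stdlib; the specific dependencies below are invented for illustration, since I don't know the exact ordering hints the original migration produced:

```python
from graphlib import TopologicalSorter, CycleError

# Hypothetical ordering constraints among the four columns from the error,
# chosen to form a cycle: each column "must come after" the next one.
deps = {
    "display_order": {"parent_id"},
    "parent_id": {"is_expanded"},
    "is_expanded": {"is_collapsible"},
    "is_collapsible": {"display_order"},  # closes the cycle
}

try:
    list(TopologicalSorter(deps).static_order())
except CycleError as exc:
    print("cycle detected:", exc.args[1])
```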

I have reproduced the issue with only this migration applied and no other changes, confirming it is not a local environment problem.

Additionally, I tested again by downloading a fresh ZIP of the main branch (which does not include my service plugin runtime code or migrations). I set it up in a new local project, ran the initial installation steps, and the same issue occurred (I could not access the dashboard).

Let me know if you need to see additional logs @DJJones

@beck
I couldn’t find anything sticking out for the circular dependency, but I did find some areas I could clean up. I did a test run on a new db and had no issues on my end.

"""add_hierarchical_navigation_support

Revision ID: 219da9748f46
Revises: cb95bbe8b720
Create Date: 2025-08-14 08:25:21.335545
"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
from sqlalchemy import text

# revision identifiers, used by Alembic.
revision: str = "219da9748f46"
down_revision: Union[str, None] = "cb95bbe8b720"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Add hierarchical navigation support (SQLite-safe)."""

    # ---- Phase 1: add columns ONLY (nullable first; no FK yet) ----
    # Using recreate='always' to avoid SQLite edge cases
    with op.batch_alter_table("navigation_routes", recreate="always") as batch_op:
        batch_op.add_column(sa.Column("parent_id", sa.String(32), nullable=True))
        batch_op.add_column(
            sa.Column(
                "display_order",
                sa.Integer(),
                nullable=True,  # temporarily nullable; we'll tighten after backfill
                server_default=sa.text("0"),
            )
        )
        batch_op.add_column(
            sa.Column(
                "is_collapsible",
                sa.Boolean(),
                nullable=True,  # temporarily nullable; we'll tighten after backfill
                server_default=sa.text("1"),
            )
        )
        batch_op.add_column(
            sa.Column(
                "is_expanded",
                sa.Boolean(),
                nullable=True,  # temporarily nullable; we'll tighten after backfill
                server_default=sa.text("1"),
            )
        )

    # ---- Phase 1b: backfill NULLs and enforce NOT NULLs ----
    conn = op.get_bind()
    conn.execute(
        text(
            """
            UPDATE navigation_routes
            SET display_order = COALESCE(display_order, 0),
                is_collapsible = COALESCE(is_collapsible, 1),
                is_expanded   = COALESCE(is_expanded, 1)
            """
        )
    )

    with op.batch_alter_table("navigation_routes", recreate="always") as batch_op:
        batch_op.alter_column("display_order", nullable=False)
        batch_op.alter_column("is_collapsible", nullable=False)
        batch_op.alter_column("is_expanded", nullable=False)

    # ---- Phase 2: create the self-referential FK in its own batch ----
    with op.batch_alter_table("navigation_routes", recreate="always") as batch_op:
        batch_op.create_foreign_key(
            "fk_navigation_routes_parent_id",
            "navigation_routes",
            ["parent_id"],
            ["id"],
            ondelete="CASCADE",
        )

    # ---- Phase 3: indexes ----
    op.create_index(
        "idx_navigation_routes_parent_id",
        "navigation_routes",
        ["parent_id"],
    )
    op.create_index(
        "idx_navigation_routes_display_order",
        "navigation_routes",
        ["display_order"],
    )
    op.create_index(
        "idx_navigation_routes_parent_order",
        "navigation_routes",
        ["parent_id", "display_order"],
    )

    # ---- Phase 4: seed / migrate data ----
    your_braindrive_id = "yourbraindriveparent123456789012"  # <= 32 chars
    your_pages_id = "yourpagesparent1234567890123456"        # <= 32 chars

    # Create "Your BrainDrive" parent (idempotent)
    conn.execute(
        text(
            """
            INSERT OR IGNORE INTO navigation_routes (
                id, name, route, icon, description,
                is_visible, creator_id, is_system_route,
                display_order, is_collapsible, is_expanded,
                created_at, updated_at
            ) VALUES (
                :id, 'Your BrainDrive', 'your-braindrive', 'AccountTree',
                'Core BrainDrive functionality and settings',
                1, 'system', 1,
                0, 1, 1,
                CURRENT_TIMESTAMP, CURRENT_TIMESTAMP
            )
            """
        ),
        {"id": your_braindrive_id},
    )

    # Create "Your Pages" parent (idempotent)
    conn.execute(
        text(
            """
            INSERT OR IGNORE INTO navigation_routes (
                id, name, route, icon, description,
                is_visible, creator_id, is_system_route,
                display_order, is_collapsible, is_expanded,
                created_at, updated_at
            ) VALUES (
                :id, 'Your Pages', 'your-pages', 'CollectionsBookmark',
                'Your custom pages and content',
                1, 'system', 1,
                1, 1, 1,
                CURRENT_TIMESTAMP, CURRENT_TIMESTAMP
            )
            """
        ),
        {"id": your_pages_id},
    )

    # Migrate existing system routes under "Your BrainDrive"
    route_migrations = [
        ("settings", 0, "Settings"),
        ("personas", 1, "Personas"),
        ("plugin-manager", 2, "Plugin Manager"),
        ("plugin-studio", 3, "Page Builder"),  # rename
    ]

    for route, order, display_name in route_migrations:
        result = conn.execute(
            text(
                """
                UPDATE navigation_routes
                SET parent_id = :parent_id,
                    display_order = :display_order,
                    name = :name
                WHERE route = :route AND is_system_route = 1
                """
            ),
            {
                "parent_id": your_braindrive_id,
                "display_order": order,
                "name": display_name,
                "route": route,
            },
        )
        # optional: prints if you want runtime feedback in logs
        # print(f"Migrated {route} -> {display_name}") if result.rowcount else print(f"Warning: {route} not found")

    # Ensure "Prompt Library" exists under "Your BrainDrive"
    conn.execute(
        text(
            """
            INSERT OR IGNORE INTO navigation_routes (
                id, name, route, icon, description,
                is_visible, creator_id, is_system_route,
                parent_id, display_order, is_collapsible, is_expanded,
                created_at, updated_at
            ) VALUES (
                'promptlibrary123456789012345678',
                'Prompt Library', 'prompt-library', 'LibraryBooks',
                'Manage your AI prompts and templates',
                1, 'system', 1,
                :parent_id, 4, 1, 1,
                CURRENT_TIMESTAMP, CURRENT_TIMESTAMP
            )
            """
        ),
        {"parent_id": your_braindrive_id},
    )

    # Bump display orders for any remaining root-level routes
    conn.execute(
        text(
            """
            UPDATE navigation_routes
            SET display_order = CASE
                WHEN route = 'your-braindrive' THEN 0
                WHEN route = 'your-pages' THEN 1
                ELSE COALESCE(display_order, 0) + 2
            END
            WHERE parent_id IS NULL
            """
        )
    )


def downgrade() -> None:
    """Remove hierarchical navigation support (SQLite-safe)."""

    conn = op.get_bind()

    # Move children to root level; we cannot recover prior ordering, so use current values
    conn.execute(
        text(
            """
            UPDATE navigation_routes
            SET parent_id = NULL,
                display_order = COALESCE(display_order, 0)
            WHERE parent_id IS NOT NULL
            """
        )
    )

    # Remove the parent routes we created
    conn.execute(
        text(
            """
            DELETE FROM navigation_routes
            WHERE route IN ('your-braindrive', 'your-pages')
              AND is_system_route = 1
            """
        )
    )

    # Drop indexes first (portable signature)
    op.drop_index("idx_navigation_routes_parent_order", table_name="navigation_routes")
    op.drop_index("idx_navigation_routes_display_order", table_name="navigation_routes")
    op.drop_index("idx_navigation_routes_parent_id", table_name="navigation_routes")

    # Drop FK and columns in a batch recreate
    with op.batch_alter_table("navigation_routes", recreate="always") as batch_op:
        batch_op.drop_constraint(
            "fk_navigation_routes_parent_id", type_="foreignkey"
        )
        batch_op.drop_column("is_expanded")
        batch_op.drop_column("is_collapsible")
        batch_op.drop_column("display_order")
        batch_op.drop_column("parent_id")

I am going to do a new install on a bare system to see if that exposes anything (this will be from the main repo, not my dev branch or the above changes).

Just installed it from main, no other branches (if I understood what you said, you experienced the issue here as well). This is the migration section:

✅ Created empty SQLite DB at braindrive.db

Running Alembic migrations to initialize database...

INFO  [alembic.runtime.migration] Context impl SQLiteImpl.

INFO  [alembic.runtime.migration] Will assume non-transactional DDL.

INFO  [alembic.runtime.migration] Running upgrade  -> b059e76c411a, new_baseline_from_existing_db

INFO  [alembic.runtime.migration] Running upgrade b059e76c411a -> f0bc573ed538, standardize_uuid_format

INFO  [alembic.runtime.migration] Running upgrade f0bc573ed538 -> fcba819d216c, standardize_role_uuid_format

INFO  [alembic.runtime.migration] Running upgrade fcba819d216c -> 0a4f64f25f5c, standardize_tag_and_plugin_uuid_format

INFO  [alembic.runtime.migration] Running upgrade 0a4f64f25f5c -> 8f37e0df1aaa, add version field to users

INFO  [alembic.migration] Starting to add version field to users table

INFO  [alembic.migration] Successfully added version field to users table

INFO  [alembic.runtime.migration] Running upgrade 8f37e0df1aaa -> add_plugin_update_fields, Add plugin update tracking fields

INFO  [alembic.runtime.migration] Running upgrade 8f37e0df1aaa, add_plugin_update_fields -> c5130afa654b, Merge conflicting heads

INFO  [alembic.runtime.migration] Running upgrade c5130afa654b -> add_conversation_type_simple, add conversation_type field

INFO  [alembic.runtime.migration] Running upgrade 8f37e0df1aaa -> a1b2c3d4e5f7, add personas table

INFO  [alembic.migration] Starting to create personas table

INFO  [alembic.migration] Successfully created personas table

INFO  [alembic.runtime.migration] Running upgrade add_conversation_type_simple -> add_page_id_to_conversations, add page_id to conversations

INFO  [alembic.runtime.migration] Running upgrade add_conversation_type_simple -> restore_settings_tables, restore settings tables

settings_definitions table already exists, skipping creation

settings_instances table already exists, skipping creation

INFO  [alembic.runtime.migration] Running upgrade restore_settings_tables, add_page_id_to_conversations -> c7eed881f009, merge heads

INFO  [alembic.runtime.migration] Running upgrade c7eed881f009, a1b2c3d4e5f7 -> abb6734b0519, merge personas with existing heads

INFO  [alembic.runtime.migration] Running upgrade abb6734b0519 -> 49d16cc7f8f1, add_persona_id_to_conversations

INFO  [alembic.runtime.migration] Running upgrade 49d16cc7f8f1 -> 7d0185f79500, add_plugin_states_table

INFO  [alembic.runtime.migration] Running upgrade 7d0185f79500 -> cb95bbe8b720, mark_settings_tables_restored

✅ Verified settings_definitions table exists

✅ Verified settings_instances table exists

✅ Settings tables restoration check completed

INFO  [alembic.runtime.migration] Running upgrade cb95bbe8b720 -> 219da9748f46, add_hierarchical_navigation_support

Phase 1: Adding hierarchical columns to navigation_routes...

Phase 1 completed: Hierarchical columns added successfully

Phase 2: Creating parent routes and migrating data...

Warning: Route settings not found for migration

Warning: Route personas not found for migration

Warning: Route plugin-manager not found for migration

Warning: Route plugin-studio not found for migration

Phase 2 completed: Parent routes created and data migrated

Phase 3: Updating display orders...

Migration completed successfully!
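
Worth noting: those "Route ... not found" warnings mean the parenting UPDATEs matched zero rows, because on a fresh install those system routes haven't been seeded yet at migration time, so the reparenting silently does nothing. A minimal illustration of that check (stdlib sqlite3, schema reduced to the relevant columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE navigation_routes (route TEXT, parent_id TEXT, is_system_route INTEGER)"
)
# Fresh install: the 'settings' route does not exist yet.

cur = conn.execute(
    "UPDATE navigation_routes SET parent_id = ? WHERE route = ? AND is_system_route = 1",
    ("yourbraindriveparent123456789012", "settings"),
)
if cur.rowcount == 0:
    # Matches the log output: the UPDATE was a no-op.
    print("Warning: Route settings not found for migration")
```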

do you have any idea, @DJJones ,why login (and refresh) is successful:

http://localhost:8005/api/v1/auth/login

{
    "access_token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIyYThkYzc5M2FlNzQ0YzZiOGI0ZTdiMWJhNGUzZjFmZCIsImV4cCI6MTc1NjEzMDY4MS4zNzM1MSwiaWF0IjoxNzU2MTI5NzgxLjM3MzUxfQ.kNsl-aoOFiWGIFZyn6rxPU4Zmb2xs6xG3gAQEleaY2k",
    "token_type": "bearer",
    "refresh_token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIyYThkYzc5M2FlNzQ0YzZiOGI0ZTdiMWJhNGUzZjFmZCIsInJlZnJlc2giOnRydWUsImV4cCI6MTc1ODcyMTc4MS4zNzQ1MiwiaWF0IjoxNzU2MTI5NzgxLjM3NDUyfQ.r4HC4r6UmireUN62xBIVlL2C0WARNeQT9Cak2Xe1RSU",
    "expires_in": 900,
    "refresh_expires_in": 2592000,
    "issued_at": 1756129781,
    "user_id": "2a8dc793ae744c6b8b4e7b1ba4e3f1fd",
    "user": {
        "id": "2a8dc793ae744c6b8b4e7b1ba4e3f1fd",
        "username": "Beck",
        "email": "new@gmail.com",
        "full_name": null,
        "profile_picture": null,
        "is_active": true,
        "is_verified": false,
        "version": "0.4.5"
    }
}

but any following request (after the successful login) throws a “401 Unauthorized” error.
for example:
http://localhost:8005/api/v1/settings/instances?definition_id=theme_settings&user_id=2a8dc793ae744c6b8b4e7b1ba4e3f1fd&scope=user

{"detail":"Could not validate credentials: Signature has expired."}
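
One quick way to narrow down a "Signature has expired" 401 right after login: decode the token's claims (no signature verification needed for debugging) and compare exp against the server clock. If exp is still in the future but the server rejects the token, the likely culprits are clock skew or a secret mismatch rather than genuine expiry. A stdlib-only sketch using the access token from the response above:

```python
import base64
import json
import time

def jwt_claims(token: str) -> dict:
    """Decode a JWT payload WITHOUT verifying the signature (debugging only)."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

token = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIyYThkYzc5M2FlNzQ0YzZiOGI0ZTdiMWJhNGUzZjFmZCIsImV4cCI6MTc1NjEzMDY4MS4zNzM1MSwiaWF0IjoxNzU2MTI5NzgxLjM3MzUxfQ.kNsl-aoOFiWGIFZyn6rxPU4Zmb2xs6xG3gAQEleaY2k"

claims = jwt_claims(token)
print("lifetime:", round(claims["exp"] - claims["iat"]), "seconds")  # 900, matching expires_in
print("expired relative to this machine?", time.time() > claims["exp"])
```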