DDEV Blog: Using WarpBuild to speed up DDEV in CI
For most developers, DDEV solves a common challenge: making sure that each developer has a consistent, stable local environment for building their web application. We've had more and more success with DDEV at Lullabot, but another related issue kept coming up: how do we grow our use of continuous integration and automated testing while avoiding the same challenges DDEV solved for us locally?
A typical CI/CD pipeline is implemented using the tools and systems provided by the CI service itself. For example, at a basic level you can place shell commands inside configuration files to run tests and tools. Running those commands locally in DDEV is possible, but it's a painful copy/paste process. If you're a back-end or DevOps engineer, odds are high you've wasted hours trying to figure out why a test you wrote locally isn't passing in CI – or vice versa!
As a first step, we used Task to improve our velocity. Having a unified task runner that works outside PHP lets us standardize CI tasks more easily. However, this still left a big surface area for differences between local and CI environments. For example, in GitHub, the shivammathur/setup-php action is used to install PHP and extensions, but the action is not identical to DDEV. Underlying system libraries and packages installed with apt-get could also be different, causing unexpected issues. Finally, there was often a lag in detecting when local test environments broke because those changes weren't tested in CI.
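The article doesn't show its Taskfile, so as a purely illustrative sketch (the task names and commands below are hypothetical), a shared Task target that runs identically on a laptop and in CI might look like this:

```yaml
# Hypothetical Taskfile.yml sketch. The point of a unified task runner is
# that `task test:unit` invokes the same command locally and in CI, so the
# command itself lives in exactly one place.
version: '3'

tasks:
  test:unit:
    desc: Run PHPUnit inside DDEV
    cmds:
      - ddev exec vendor/bin/phpunit

  test:static:
    desc: Run static analysis inside DDEV
    cmds:
      - ddev exec vendor/bin/phpstan analyse
```

A CI step can then run `task test:static` verbatim, instead of duplicating the shell commands in the workflow file.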
This brought us to using DDEV for CI. It's a great solution! Running all of our builds and tasks in CI solved nearly every "it works on my machine" problem we had. However, it introduced a new challenge: CI startup performance.
Unlike using a CI-provider's built-in tooling, DDEV is not typically cached or included in CI runners. Just running the setup-ddev action can take up to a minute on a bad day. That doesn't include any additional packages or Dockerfile customizations a project may include. At Lullabot, we use ddev-playwright to run end-to-end tests. Browser engines and their dependencies are heavy! System dependencies can be north of 1GB of compressed packages (that then have to be installed), and browsers themselves can be several hundred MB. This was adding several minutes of setup time just to run a single test.
Luckily, based on our experience building Tugboat, we knew that the technology to improve startup performance existed. When WarpBuild was announced with Snapshot support in 2024, we immediately started testing it out. We theorized that the performance improvement of snapshots would result in significant startup time improvement. Here's how we set it up!
We had three parallel jobs that all required DDEV:
- Playwright Functional Tests - these were using 8 "large" runners from GitHub to complete our test suite fast. Before WarpBuild, each runner took between 15 and 20 minutes to run tests.
- Static tests running PHPStan, PHPUnit, and so on.
- ZAP for security scanning.
Note that our Playwright tests themselves run in parallel on a single worker as well, using lullabot/playwright-drupal. This allows us to optimize the additional startup time for installing Drupal itself (which can't be cached in a snapshot) across many tests.
After linking WarpBuild to our GitHub repository, we had to update our workflows. For the full combined example, see the repository at ddev/ddev-ci-warpbuild-example.
Here is an example representing the changes we made to our workflow after enabling Snapshots in the WarpBuild UI. At a high level, here's the flow we want to create with our GitHub jobs:
```mermaid
flowchart TD
    A[determine-snapshot: <br>Hash key files] --> B[Request WarpBuild runner<br>with snapshot key]
    B --> C{Snapshot exists?}
    C -->|"Yes (fast path)"| D[Restore snapshot<br>DDEV pre-installed]
    C -->|"No (first run)"| E[Install DDEV, browsers,<br>and dependencies]
    D --> F[Start DDEV & run tests]
    E --> F
    F --> G{First run?}
    G -->|Yes| H[Clean up & save snapshot]
    G -->|No| I[Done!]
    H --> I
```

Start with a basic workflow that triggers on pull requests and on merges to main.
```yaml
name: "WarpBuild Snapshot Example"
on:
  push:
    branches: [main]
  pull_request:
```

Before running our real work, we need to know which snapshot we could restore from. We start by creating a hash of key files that affect what gets saved in the snapshot. For example, if Playwright (and its browser and system dependencies) is upgraded by Renovate, we want a new snapshot to be created. Extend or modify these files to match your own project setup.
```yaml
jobs:
  determine-snapshot:
    # This could be a WarpBuild runner too!
    runs-on: ubuntu-24.04
    outputs:
      snapshot: ${{ steps.snapshot-base.outputs.snapshot }}
    steps:
      - uses: actions/checkout@v6
      - name: Determine Snapshot Base
        id: snapshot-base
        run: |
          set -x
          hash=$(cat .github/workflows/test.yml test/playwright/.yarnrc.yml test/playwright/yarn.lock | md5sum | cut -c 1-8)
          echo "snapshot=$hash" >> $GITHUB_OUTPUT
        shell: bash
```

WarpBuild needs some additional configuration to tell GitHub Actions to use it as a runner. This could be as simple as `runs-on: 'warp-<runner-type>'` if you aren't using snapshots. WarpBuild has many runner options available, including ARM and spot instances to reduce costs further.
The runs-on statement:
- Skips snapshots via commit messages.
- Uses a "16x" sized runner so we can run tests in parallel.
- Creates a snapshot key with the project name, the ddev version, a manual version number, and the short hash of the files from above.
We also switch to the WarpBuild cache (so it's local to the runner) and check out the project. Update the cache paths as appropriate for your project.
```yaml
jobs:
  # other jobs...
  build-and-test:
    needs: [determine-snapshot]
    runs-on: "${{ contains(github.event.head_commit.message, '[warp-no-snapshot]') && 'warp-ubuntu-2404-x64-16x' || format('warp-ubuntu-2404-x64-16x;snapshot.key=my-project-ddev-1.25.1-v1-{0}', needs.determine-snapshot.outputs.snapshot) }}"
    steps:
      - uses: WarpBuilds/cache@v1
        with:
          path: |
            ${{ github.workspace }}/.ddev/.drainpipe-composer-cache
            ${{ github.workspace }}/vendor
            ${{ github.workspace }}/web/core
            ${{ github.workspace }}/web/modules/contrib
          key: ${{ runner.os }}-composer-full-${{ hashFiles('**/composer.lock') }}
      - uses: actions/checkout@v6
```

We need to add logic to either start from scratch and install everything or restore from a snapshot. Since DDEV isn't installed by default in runners, we can use its presence to easily determine whether we're running from inside a snapshot. We save these values for later use.
```yaml
jobs:
  # other jobs...
  build-and-test:
    steps:
      # ... previous steps ...
      - name: Find ddev
        id: find-ddev
        run: |
          DDEV_PATH=$(which ddev) || DDEV_PATH=''
          echo "ddev-path=$DDEV_PATH" >> "$GITHUB_OUTPUT"
          if [ -n "$DDEV_PATH" ]; then
            echo "ddev found at: $DDEV_PATH (restored from snapshot)"
          else
            echo "ddev not found (fresh runner, will install)"
          fi
```

If ddev exists, we can skip installing it:
```yaml
jobs:
  # other jobs...
  build-and-test:
    steps:
      # ... previous steps ...
      - name: Install ddev
        uses: ddev/github-action-setup-ddev@v1
        if: ${{ steps.find-ddev.outputs.ddev-path != '/usr/bin/ddev' }}
        with:
          autostart: false
          # When updating this version, also update the snapshot key above
          version: 1.25.1
```

At this point, we've got DDEV ready to go, so we can start it and run tests or anything else.
```yaml
jobs:
  # other jobs...
  build-and-test:
    steps:
      # ... previous steps ...
      - name: Start ddev
        run: |
          # Playwright users may want to run `ddev install-playwright` here.
          ddev start
          ddev describe
      - name: Run tests
        run: |
          ddev exec echo "Running tests..."
          # Replace this with one or more test commands for your project.
          ddev task test:playwright
```

Now, tests have passed and we can create a snapshot if needed. If tests fail, we never create a snapshot, so we don't accidentally commit a broken environment.
We shut down DDEV since we're going to clean up generated files. This keeps our snapshot a bit smaller and gives us an opportunity to clean up any credentials that might be used as a part of the job. While we don't typically need a Pantheon token for tests, we do need it for some other jobs we run with DDEV.
```yaml
jobs:
  # other jobs...
  build-and-test:
    steps:
      # ... previous steps ...
      - name: Clean up for snapshot
        if: ${{ steps.find-ddev.outputs.ddev-path != '/usr/bin/ddev' }}
        run: |
          # Stop ddev to ensure clean state
          ddev poweroff
          # Remove any cached credentials or tokens
          rm -f ~/.terminus/cache/session
          # Clean git state and temporary files
          git clean -ffdx
```

Now we can actually save the snapshot. We skip this step when we can, since saving and uploading takes a bit of time; there's no point in rewriting our snapshot if it hasn't changed! The wait-timeout-minutes is set very high, but in practice this step only takes a minute or two. We just don't want this step to fail if Amazon is slow.
```yaml
jobs:
  # other jobs...
  build-and-test:
    steps:
      # ... previous steps ...
      - name: Save WarpBuild snapshot
        uses: WarpBuilds/snapshot-save@v1
        if: ${{ steps.find-ddev.outputs.ddev-path != '/usr/bin/ddev' }}
        # Using a matrix build? Avoid thrashing snapshots by only saving from one shard.
        # if: ${{ matrix.shard == 1 && steps.find-ddev.outputs.ddev-path != '/usr/bin/ddev' }}
        with:
          # Must match the snapshot.key in runs-on above
          alias: "my-project-ddev-1.25.1-v1-${{ needs.determine-snapshot.outputs.snapshot }}"
          fail-on-error: true
          wait-timeout-minutes: 30
```

To test: once you have jobs passing, rerun them from the GitHub Actions UI. If everything is working, you will see all steps related to installing DDEV skipped.
Note: We don't pin actions to hashes in these examples for easy copypaste, but for security we always use Renovate to pin hashes for us. We would also like to use Renovate Custom Managers to automatically offer DDEV upgrades and keep the version number in sync across all files and locations.
The Results?

- The time to start Playwright tests dropped from 4–5 minutes to 1–2 minutes. Now, the longest step in the workflow is the ddev start command.
- This project uses eight parallel runners, so we're saving about 24 minutes of CI costs per commit.
- We thought costs would go down, but we ended up writing many more tests! CI costs with WarpBuild stayed roughly similar to our previous GitHub costs but with greater test coverage and faster reports.
- While the ZAP tests needed browsers just as Playwright did, the static tests didn't. However, restoring snapshots was fast enough that creating separate snapshots without browsers wasn't worth the complexity.
- Snapshot storage costs are inexpensive enough to not matter compared to the CI runner cost.
While this seems like a lot of work, it was only about half a day to set up and test – and that was when WarpBuild was in beta, had minimal documentation and some rough edges. We haven't really had to touch this code since. Setting up new projects is an hour, at most.
Do you have other optimizations for DDEV in CI to share? Post in the comments, we'd love to hear them!
DrupalCon News & Updates: DrupalCon Chicago 2026: Must‑See Sessions for Seasoned Developers
Hey experienced developers! You know how to tame Drush, charm Composer, debug like a detective, juggle configs, and wrestle with tricky modules. But there’s an event that will max out your RAM with Drupal hacks, insights, and wisdom.
Chicago may be famous for deep-dish pizza, but this spring it’s serving up something even more satisfying: deep dives into Drupal. DrupalCon Chicago 2026 is the place for seasoned developers to sharpen their skills, swap stories, and maybe laugh at a few module mishaps along the way.
It’s a code playground with a side of professional growth — sessions designed to challenge, inspire, and connect. Ready to level up your craft and enjoy a few geeky chuckles? The program is packed with standout sessions, but here are a few you absolutely won’t want to miss.
Top DrupalCon Chicago 2026 Sessions for Experienced Developers

“The state of JavaScript Code Components in Drupal Canvas” — by Bálint Kléri

Drupal Canvas, the new-generation page builder, offers multiple ways to create pages for different audiences. Non-tech users will enjoy intuitive drag-and-drop tools, ready-made components, and even building pages from a prompt to an AI agent. But what’s in it for developers? First of all, it’s Code Components.
JavaScript in Drupal keeps evolving, and Code Components in Drupal Canvas are the latest twist worth watching. First unveiled at DrupalCon Atlanta, they came with a zero‑setup, in-browser editor and instant support for React and Tailwind CSS.
Things have moved fast: data fetching and Next.js-style image optimization are now supported, and experiments with server-side rendering and third-party imports are in progress. And editing isn’t limited to the browser anymore — a new CLI lets you work with Code Components anywhere, opening doors to decoupled frontends and fresh workflows.
Catch Bálint Kléri (balintbrews), the technical lead for JavaScript components in Drupal, in his insightful session, where he will walk through what’s stable, what’s experimental, and what’s next. You’ll discover specific approaches and techniques for working with Code Components.
“AI Agents for Site Builders” — by Marcus Johansson

AI-driven automation is changing the ways organizations handle content personalization, workflows, customer support, and data insights. One of the most exciting tools to emerge from the Drupal AI initiative is AI agents — autonomous systems that carry out tasks, make decisions, and pursue goals on behalf of users.
You can learn more about Drupal’s new AI Agents framework from Marcus Johansson (marcus_johansson). On his drupal.org page, Marcus describes himself simply: “I tinker with AI.” But his “tinkering” is transforming Drupal from the ground up: Marcus leads the Drupal AI initiative, shaping its architecture, driving its development roadmap, and steering the future of AI-powered tools in Drupal.
In this session, Marcus will show you how Drupal’s Agents framework lets you create business-specific agents without writing a single line of code. Instead of slogging through implementation details, you’ll see how prompt writing and communication skills can drive the interaction, while Drupal quietly handles the complexity behind the curtain.
Join Marcus as he unpacks what agents are, how the framework was built, and how it connects with the MCP (Model Context Protocol). For experienced developers, this session is a chance to explore a tool that cuts through the noise and unlocks fresh possibilities.
“Advanced Site Building with Drupal Canvas” — by Ted Bowman

Drupal Canvas is gaining serious momentum, and it deserves a closer look from more than one angle. Alongside the earlier-mentioned session on Code Components, this one is a hands-on exploration of how Canvas works hand in hand with some of Drupal’s most powerful tools.
Ted Bowman (tedbow), a long-time Drupal contributor, will show how Drupal Canvas can be combined with core features like Views and popular contributed modules to build advanced setups — all without writing a single line of code.
You’ll see an exciting demo packed with practical examples: creating dynamic landing pages, formatting structured content with Canvas templates, linking field data to SDC (Single-Directory Components) and Code Component properties, building Views inside Canvas templates, and using template slots to give editors more control.
As Canvas continues to evolve, Ted will also spotlight the latest features and contributed modules that extend its capabilities even further.
“Next Generation ECA - Vision and Progress update” — by Jürgen Haas

Drupal offers many ways to automate workflows. But having tasks quietly carried out in the background — triggered by events, checked against conditions, and completed through actions — is a special kind of magic.
Experienced developers may remember the Rules module that pioneered this idea. Today, its modern successor, the Event–Condition–Action (ECA) module, takes the concept further, reimagined for Drupal’s current ecosystem. With a no-code/low-code approach and graphical modeling tools like BPMN, ECA makes building workflows more intuitive and far less intimidating.
Despite its amazing graphical interface with diagrams for workflows, ECA needed to become even more approachable, especially for people without prior Drupal experience. So after solid real-world use and plenty of feedback from the community, ECA is entering its next phase. In his session, ECA’s creator, Jürgen Haas (jurgenhaas), will share how things are going with the revamp.
By lowering the barrier for site builders and project managers, the evolving ECA creates more room for developers to extend, integrate, and scale automation. The refreshed interface makes workflows easier to work with, while the underlying architecture opens fresh opportunities for custom plugins, enterprise integrations, and performance tuning.
“A Taste of the Future: Site Templates and Recipes in Drupal” — by Jim Birch

Drupal has always had a knack for ambitious site building, but the Recipes Initiative is cooking up something new. Instead of distributions that lacked flexibility, we now have lightweight, composable recipes and site templates that make functionality easier to share, remix, and extend.
For experienced developers, it’s about speeding up the boring parts so you can focus on the interesting ones. Default content APIs and config actions are steadily maturing, and the community is already serving up recipes that cut down setup time while keeping flexibility intact.
Step into this session led by Jim Birch (thejimbirch), a renowned Drupal core committer and initiative coordinator. He will walk you through the progress so far, highlight examples from the community, and demonstrate practical authoring workflows. You’ll leave with a clear sense of how recipes fit into Drupal’s future, how to find and apply them effectively, and how to contribute your own to the growing ecosystem.
“Beyond Iframes: Modern Embedding in Drupal with Media and oEmbed” — by Pedro Cambra

Embedding external content in Drupal has become trickier as platforms tighten security rules and content policies. If iframes keep letting you down, this session offers a cleaner, more future-proof approach.
Join Pedro Cambra (pcambra), an experienced Drupal contributor, as he shares practical guidance on embedding external content with oEmbed. He explores how Drupal uses the oEmbed standard together with core Media tools and contributed modules like oEmbed Providers to embed third-party content safely and reliably. You’ll get a clear look at how oEmbed works behind the scenes, which modules fit best for different use cases, and how CKEditor handles embedded objects.
The session also touches on enhancing embeds with authentication or privacy controls and building your own oEmbed resources for custom content. Practical examples keep things grounded, with plenty of tips you can apply right away. If you’ve ever wrestled with embeds or want a more robust setup that plays nicely with modern platforms, this session is well worth your time.
“Future Proof Theming: Best Practices for Drupal’s New Era” — by Mike Herchel and Andy Blum

Theming in Drupal is entering a new era, and this training is designed to keep even seasoned developers ahead of the curve. It will be led by Mike Herchel (mherchel) and Andy Blum (andy-blum), key contributors driving theming innovations in Drupal. The training session dives into Single Directory Components, Drupal Canvas, and modern CSS/JS techniques that will shape how we build themes going forward.
Through hands‑on exercises, you’ll learn to craft reusable components, streamline workflows with Storybook, and deliver designs that are fast, accessible, and maintainable.
You’ll pick up strategies to dodge common page‑builder pitfalls and keep your themes flexible for whatever comes next. If you’re ready to sharpen your skills and future‑proof your toolkit, this training belongs on your schedule. It must be noted that training sessions require an additional ticket.
Driesnote — by Dries Buytaert

Every DrupalCon has its traditions, and the Driesnote is one of the most anticipated. For developers who spend their days building and maintaining Drupal sites, the Driesnote is a chance to catch a glimpse of what’s emerging in the platform.
Beyond the usual updates, it’s the place to hear the newest features, initiatives, and announcements that set the stage for what’s coming next for Drupal. You’ll discover new tools, architectural changes, and exciting directions for core and contributed projects, all straight from Dries Buytaert. Right in the main auditorium, you’ll catch demos that haven’t been shown anywhere else yet.
This session is Drupal’s roadmap in real time. Experienced developers will walk away with inspiration for their own projects, and an insider’s view of upcoming improvements that could change how we work with the platform.
Final thoughts

Wrapping up the developer sessions at DrupalCon Chicago, the vibe is clear: Drupal keeps giving us new tools, and it’s up to us to explore them, stress-test them, and help shape what comes next. Canvas, ECA V2, Recipes, Site Templates, and other cool innovations are all evolving fast, and the fun part is digging into the details to see how they really work.
For experienced developers, the focus shifts away from shiny demos to spotting patterns, catching edge cases, and laughing when the “easy” stuff turns into a rabbit hole. These sessions are a reminder that Drupal’s future is being built in real time — and that we still get to shape it through commits, patches, and the occasional late-night debugging marathon.
Authored By: Nadiia Nykolaichuk, DrupalCon Chicago 2026 Marketing & Outreach Committee Member
Talking Drupal: Talking Drupal #541 - Mautic
Today we are talking about Mautic, marketing automation, and its history with Drupal with guest Ruth Cheesley. We'll also cover Mautic ECA as our module of the week.
For show notes visit: https://www.talkingDrupal.com/541
Topics

- What Is Mautic?
- Self-Hosting and Data Ownership
- Who Uses Mautic + Personalization
- Mautic's History with Drupal
- How Drupal Integrates with Mautic
- Orchestration in Mautic
- Privacy & Compliance: GDPR Tools, Consent, and Do-Not-Contact Controls
- Hosting Options
- Advanced Segmentation
- Points-Based Lead Scoring
- Validating Segments
- Using Points to Boost
- Common Mautic Adoption Pitfalls
- Getting Support
- The Future with AI
- AI and Open Source Maintenance
- Mautic Sustainability & Fundraising
- How to Contribute
- Mautic
- Mautic Integration
- Advanced Mautic Integration
- Talking Drupal #343 - Marketing Automation with Mautic
- Managed hosting, 40% goes to the community
- Mautic/Drupal case study and presentation on that from our conference
- GDPR cleanup jobs to remove old data
- Anonymization tasks to comply with specific laws (e.g. CCPA)
- Anonymize IP setting
- Proposal to overhaul all things privacy and streamline experience for marketers - currently seeking funding, planning to ship in Mautic 9
- Mautic contribution docs
- Testing PRs: including local setup guide
- Low/no-code tasks board
- Thanks Dev
- Ecosystems
Ruth Cheesley - ruthcheesley.co.uk RCheesley
Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Catherine Tsiboukas - mindcraftgroup.com bletch
MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu
- Brief description:
- Have you ever wanted to integrate Mautic marketing automation into your Drupal website, using ECA? There's a module for that.
- Module name/project name:
- Brief history
- How old: created in Jun 2025 by Abhisek Mazumdar (abhisekmazumdar) of Dropsolid
- Versions available: 1.0.6 which works with Drupal 10 and 11
- Maintainership
- Actively maintained
- Documentation - detailed README
- Number of open issues: 1 open issue, which is not a bug
- Usage stats:
- 3 sites
- Module features and usage
- With the module installed, your ECA models can respond to Mautic webhooks, and can also make use of new actions to give you CRUD capabilities (Create, Read, Update, or Delete) for contacts and segments within ECA
- Mautic ECA declares the Mautic API module as a dependency; you need to use it to set up an API connection and to define any webhooks you want to use in your models
- It's worth noting that the maintainers of Mautic ECA also seem to be involved with a number of other modules in the Mautic API ecosystem, including Mautic Personalization, as well as Mautic Content Provider, which can expose Drupal content for use in Mautic, for example to include in emails
Mike Herchel's Blog: Fun, dancing, and contribution at Florida DrupalCamp (photos included)!
The Drop Times: Architecture Before Alchemy
Every technology cycle has its buzzwords. This one has AI stamped on every roadmap, budget request, and vendor pitch deck. As Nitish Chopra argues, the rush to “integrate AI” into content management systems feels less like strategy and more like panic. Organizations are bolting large language models onto legacy stacks without confronting a harder truth: most digital architectures were never designed to support structured, machine-readable intelligence.
The pattern is becoming predictable across the CMS landscape. Some teams chase quick wins through plugin overload and surface-level integrations. Others pay enterprise premiums for repackaged APIs marketed as innovation. A third group disappears into technical rabbit holes, overengineering AI experiments that never survive contact with production realities. In each case, the failure is architectural. AI is treated as a feature to be installed, not a capability that depends on disciplined data modeling, governance, and system design.
This is where the Drupal ecosystem enters the conversation. For years, Drupal’s insistence on entities, fields, taxonomies, and structured content was criticized as overly complex. Yet those very foundations align with what AI systems require: clean schemas, reusable content objects, and predictable relationships. What once felt rigid now looks intentional. What was labeled pedantic now resembles preparation.
For Drupal builders, agencies, and enterprise stakeholders, the question is not whether to integrate AI, but how to do so without abandoning architectural discipline. AI success in Drupal will not come from chasing wrappers around APIs or cosmetic chatbot add-ons. It will come from doubling down on structured content architecture, refining data governance, and designing composable systems that can support automation at scale. In that sense, the ecosystem’s competitive advantage is not novelty. It is discipline.
This week’s edition reflects on that tension between momentum and method before turning to the stories shaping the ecosystem.
With that, let's shift the spotlight to the important stories from last week.
INTERVIEW

DISCOVER DRUPAL

- Dries Buytaert Introduces Drupal Digests to Track Drupal Development
- Drupal AI Initiative Reports Four Weeks of High-Velocity Development
- Support Sought for Long-Time Drupal Contributor Mike Feranda After Surgical Complications
- Drupal Core AGENTS.md Proposal Triggers Broader Debate on AI Guardrails
- Is It Time to Add AGENTS.md to Drupal Core?
- Drupal Agency Mergers Accelerate as Consolidation Reshapes the Ecosystem
- Stanford WebCamp 2026 Opens Session Submissions for April Hybrid Event
- DrupalCamp Italy 2026 Opens Early Bird Tickets and Call for Papers
- Drupal Open Source Day Scheduled for 12 March 2026 in Groningen
- DrupalCon Chicago 2026: CWG Issues Safety Advisory Amid Immigration Operations
- Florida DrupalCamp Begins 20 February in Orlando with Canvas and AI in Focus
- DDEV v1.25.0 Introduces Modular Share Provider System with Free Cloudflare Tunnel Option
- DrupalFit to Sponsor and Exhibit at DrupalCon North America 2026 in Chicago
We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now. To get timely updates, follow us on LinkedIn, Twitter, Bluesky, and Facebook. You can also join us on Drupal Slack at #thedroptimes.
Thank you.
Alka Elizabeth
Sub-editor
The DropTimes
Jacob Rockowitz: Coding Drupal with AI
Introduction
There is a subtle bait-and-switch here: I am going to talk about my experience coding with AI in Python, but the lessons learned apply to Drupal and the broader challenges developers face when coding with AI.
Over the past few months, AIs have begun to understand and write code for Drupal, and I want to understand how AI can help me with my Drupal projects. I would be the first to say “Vibe Coding” sounds like something invented in a hipster cafe, but it is here to stay, just like the Frappuccino.
As a Drupal Developer who has written a lot of PHP code over the years, I welcome the opportunity to write less and think more. Call me old-fashioned, but I am a self-taught developer who learned by reading books, even though AI moves so fast that books on the topic are out of date within a year. I decided to look for a book to help with this journey.
A search for “Coding Drupal with AI” yields very few results, yet it is notable that a post titled “Claude Code meets Drupal” by Dries Buytaert, the creator of Drupal, appears on the first screen of results. Sometimes, when learning something new or facing a new challenge, I like to work around the challenge.
For example, when I first started learning Drupal 8 as an experienced Drupal 6/7 developer, I was stumped by Symfony and the OOP patterns being introduced into Drupal, so I spent a few weeks building a Symfony application and then dove deep into Drupal 8. So I decided to approach AI coding in Python because it is a popular programming language that I was curious to learn. I chose to read Coding with AI: Examples in Python by Jeremy Morgan because it focuses first on AI and secondarily on using Python.
UI Suite Initiative website: Live show - Display Builder demo on Talk On My Machine (TOMM)
The Drop Times: The Work Behind the Workflow: Stas Zhuk and the Future of DDEV
Pivale: Using Agentic AI Without Vibe Coding
ComputerMinds.co.uk: Drupal 11 Upgrade Gotchas - Slick Carousel
Even as a seasoned Drupal developer, when upgrading a Drupal 10.x site to Drupal 11.x you can still encounter a number of weird issues with older legacy code that had previously (unbeknownst to you) relied on functionality that has now changed in some way, shape, or form due to the upgrade to Drupal core and its dependencies.
I've just gone through a long morning of debug hell with some JavaScript functionality on a client's site that has previously not had any issues throughout the last few Drupal upgrades. For context, this site was originally built as a Drupal 7 site (all those years ago!), has been entirely rebuilt and migrated to Drupal 9, upgraded to Drupal 10 and now is finally being upgraded to Drupal 11(.3).
The site has a fair number of different JavaScript-powered carousels, and at the time the site was originally built, the 'in vogue' solution for responsive carousels was the excellent Slick Carousel (GitHub link). I won't go into too many details about this package here, but it's worked well and hasn't caused any issues over the years with previous Drupal upgrades.
The package is dependent on jQuery, and with the move to jQuery 4.x in Drupal core (an optional dependency!), this is where the problems started. Now, it's unfair to expect a package whose last official release was nearly 9 years ago to magically work with a newer version of jQuery that didn't exist at the time (the latest version in 2017 was 3.2, but the plugin was designed around jQuery 2.x). However, with the aim of keeping within budget for this Drupal 11 upgrade, it was decided not to rewrite the entire functionality of the carousels on this site.
That would have meant implementing an entirely new (ideally non-jQuery-dependent!) plugin such as Swiper or Glider, rewriting the DOM structure in every Twig template containing carousel markup, tweaking the styling of each carousel, and rewriting the various JavaScript to work with the new plugin. If the site had only a single, simple carousel, swapping it out would have been a suitable option, but some of the carousels on this site are quite complex, so I decided to try making Slick work.
Even though there hasn't been an official release of the package since October 2017, there has been work in the master branch over the last few years to make the plugin work with more modern jQuery versions. The previous upgrade to jQuery 3.x in Drupal core (versions 8.5 and above) didn't actually cause any noticeable issues with the Slick plugin (at least in our use case), but the jQuery 4 upgrade that comes with Drupal 11 finally did.
jQuery 4 has finally removed a number of deprecated APIs, one of which is jQuery.type, which Slick used internally in multiple parts of its code. Without the function available, the JavaScript naturally blows up! Luckily, there have been a number of commits to Slick's master branch in recent years, including one in 2022 that replaced these deprecated calls, allowing the plugin to work properly with more modern jQuery versions. The commit in question was meant to fix jQuery 3.x issues, but by swapping out the deprecated API calls it has enabled the plugin to work (mostly) with 4.x as well.
So to get the master version in place instead of the latest official release (which is very old), I made the following change to our package.json (the project in question uses npm) in the list of dependencies, changing:
to (the latest commit hash in the master branch):
and then re-installed the project's JS dependencies to bring the new version in.
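For illustration, the shape of that change in package.json uses npm's github: protocol with a commit-ish (kenwheeler/slick is the plugin's repository; <commit-hash> stands in for the hash, which is omitted here as in the original post):

```json
{
  "dependencies": {
    "slick-carousel": "github:kenwheeler/slick#<commit-hash>"
  }
}
```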
(For reference, we have some code that takes the version installed into the node_modules folder and copies it into an appropriate folder in our site's custom theme directory; we then define this as a custom Drupal library and include it only where needed.)
With the latest version in place, the JS error from the previously used deprecated APIs was gone, yay! But now we had other issues to worry about.
The first thing I noticed, now that the JS error was gone, was that a carousel on the homepage looked to be styled incorrectly compared to the version in production. Closer inspection of the DOM revealed that, for some reason (despite no changes to any of our Slick invocation calls), the slides were now wrapped in two extra divs instead of the slides themselves getting the slick-slide class (amongst others).
At this point I assumed this was simply new behaviour in the updated Slick code, so I made a few quick CSS changes to account for the extra wrapping divs we had never previously needed to handle. Later on, I discovered the real reason for these additional wrapping divs... keep reading to find out what it was.
Broken:
Fixed:
This solved the immediate problem, and then I went hunting for the other carousels on the site, which is where things got very interesting (and time-consuming!).
The next carousels with an apparent problem were on the site's main product page: one is used on the left-hand side (displayed vertically) and acts as a thumbnail navigation for the 'main' product image gallery displayed next to it. The left-hand one appeared to be functioning mostly correctly in itself (with a small style issue), but clicking it would not progress the main slideshow at all! With no JS errors in the console and nothing obviously wrong, cue the debugging rabbit hole...
I won't go into detail about all the paths I went down whilst debugging this, but needless to say it involved swapping the usual minified version of the Slick plugin for the non-minified version, firing up the excellent Chrome JS debugger, and stepping through exactly what Slick was doing when setting this carousel up and why it behaved the way it did.
After a while, I finally realised the issue arose after Slick had started initialising itself (after the plugin was invoked in the site's code) but before it finished setup. During setup, something was going wrong internally with the reference to the slides, which meant they were not copied into the slick-track div, and the previously mentioned (new) wrapping divs were present in the DOM but with no classes on them at all.
The JS debugging revealed that the number of $slides being returned from the following Slick code was actually zero!
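Paraphrased from Slick's source (using the selector quoted further down in this post, with surrounding chained calls omitted), the line in question looks roughly like this:

```javascript
// Slick collecting its slides during setup: if this selector matches
// nothing, $slides is empty and the rest of the setup falls apart.
_.$slides = _.$slider
    .children(_.options.slide + ':not(.slick-cloned)')
    .addClass('slick-slide');
```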
This meant the rest of the code that copies the slides into the slick-track div (amongst other setup procedures) was failing. But how could this possibly be, when debug code run just before initialising Slick confirmed the correct number of slides in the DOM?
The devil is in the detail here: it turns out the children() selector was no longer matching my slide container's children. But if we didn't change anything about the code that invokes the carousel, why exactly was it broken?
The key lies in Slick's (optional) slide parameter, which controls the element query used to find the slides. The working vertical carousel (amongst others that worked) wasn't using the slide parameter, as its DOM structure has the slides directly below the slide container. The broken carousel was using it, for a specific reason I won't get into in detail: other markup exists in the DOM at that spot for another purpose on the site, hence the need to specify a selector.
It turns out that if you omit the slide parameter, Slick internally uses > div as the slide selector; if you specify one, it (obviously) uses that. Because our invocation specified a custom selector, and there were now (as mentioned above) two extra wrapping divs in play, Slick's selector .children( _.options.slide + ':not(.slick-cloned)') no longer matched: my slide targets had inadvertently become grandchildren! The default > div selector would still have matched the new wrapper divs, but an explicitly defined custom selector did not.
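The failure mode can be sketched without jQuery at all. This toy model (plain objects; the class name "my-slide" and the structure are hypothetical, for illustration only) shows why a direct-child lookup stops matching once each slide gains two wrapper divs:

```javascript
// jQuery's .children(selector) inspects DIRECT children only, so once
// buildRows() wraps each slide, the real slides become grandchildren.
function directChildrenWithClass(node, cls) {
  return node.children.filter((el) => el.cls === cls);
}

const slide = { cls: 'my-slide', children: [] };

// Before the update: slides sit directly under the container.
const before = { cls: 'container', children: [slide] };

// After buildRows(): each slide is wrapped in two class-less divs.
const after = {
  cls: 'container',
  children: [{ cls: '', children: [{ cls: '', children: [slide] }] }],
};

console.log(directChildrenWithClass(before, 'my-slide').length); // 1
console.log(directChildrenWithClass(after, 'my-slide').length);  // 0
```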
The real question?
But why are there now two wrapping <div> elements around each slide where previously there were not?
This is the real question that needs answering, now that we understand why the selector wasn't working during the setup for the slides themselves.
Slick's default setting for the number of rows in a slideshow is 1 (unless overridden when invoking the plugin). Internally, Slick's buildRows() function (called during carousel initialisation) checks whether the number of rows is > 0, and if so, it wraps the inner slides in these two divs!
```javascript
Slick.prototype.buildRows = function() {

    var _ = this, a, b, c, newSlides, numOfSlides, originalSlides, slidesPerSection;

    newSlides = document.createDocumentFragment();
    originalSlides = _.$slider.children();

    if (_.options.rows > 0) {
        slidesPerSection = _.options.slidesPerRow * _.options.rows;
        numOfSlides = Math.ceil(originalSlides.length / slidesPerSection);

        for (a = 0; a < numOfSlides; a++) {
            var slide = document.createElement('div');
            for (b = 0; b < _.options.rows; b++) {
                var row = document.createElement('div');
                for (c = 0; c < _.options.slidesPerRow; c++) {
                    var target = (a * slidesPerSection + ((b * _.options.slidesPerRow) + c));
                    if (originalSlides.get(target)) {
                        row.appendChild(originalSlides.get(target));
                    }
                }
                slide.appendChild(row);
            }
            newSlides.appendChild(slide);
        }

        _.$slider.empty().append(newSlides);
        _.$slider.children().children().children()
            .css({
                'width': (100 / _.options.slidesPerRow) + '%',
                'display': 'inline-block'
            });
    }
};
```
A quick check of setting rows to 0 in the carousel settings confirmed this was indeed the overall problem, and immediately my carousels looked and behaved the way they did before the update. The rows setting is designed for putting Slick in a "grid mode", where you specify how many rows you want and how many slides per row via the slidesPerRow parameter.
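With that in mind, the eventual fix is a one-line change at invocation time. A sketch (the element and slide selectors here are hypothetical; only the rows: 0 setting is the actual fix described above):

```javascript
$('.product-gallery').slick({
  slide: '.product-slide', // hypothetical custom slide selector
  rows: 0                  // disable buildRows() wrapping so the custom
                           // slide selector matches direct children again
});
```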
But why are most of the carousels now getting their slides wrapped in rows due to buildRows(), even if I haven't changed the rows parameter from its previous default of 1?
The plugin's documentation adds some confusion here: it states the default value is 1, but also says "Setting this to more than 1 initializes grid mode. Use slidesPerRow to set how many slides should be in each row." This is clearly no longer true, as a commit present in the previous version we were using (1.8.1) had changed this check from if(_.options.rows > 1) to if(_.options.rows > 0) without updating the documentation to match.
... or is it? The final "gotcha" was that after comparing the minified JS provided in release 1.8.1 with the non-minified JS... the minified JS of 1.8.1 does indeed check if rows > 1, not rows > 0, but the non-minified code checks if rows > 0 - so they don't match, doh!
Très Bien Blog: Dig for gold in Drupal Contrib code
Dries Buytaert: A better way to follow Drupal development
I've been reading Drupal Core commits for more than 15 years. My workflow hasn't changed much over time. I subscribe to the Drupal Core commits RSS feed, and every morning, over coffee, I scan the new entries. For many of them, I click through to the issue on Drupal.org and read the summary and comments.
That workflow served me well for a long time. But when Drupal Starshot expanded my focus beyond Drupal Core to include Drupal CMS, Drupal Canvas, and the Drupal AI initiative, it became much harder to keep track of everything. All of this work happens in the open, but that doesn't make it easy to follow.
So I built a small tool I'm calling Drupal Digests. It watches the Drupal.org issue queues for Drupal Core, Drupal CMS, Drupal Canvas, and the Drupal AI initiative. When something noteworthy gets committed, it feeds the discussion and diff to AI, which writes me a summary: what changed, why it matters, and whether you need to do anything. You can see an example summary to get a feel for the format.
Each issue summary currently lives as its own Markdown file in a GitHub repository. Since I still like my morning coffee and RSS routine, I also generate RSS feeds that you can subscribe to in your favorite reader.
I built this to scratch my own itch, but realized it could help with something bigger. Staying informed is one of the hardest parts of contributing to a large Open Source project. These digests can help new contributors ramp up faster, help experienced module maintainers catch API changes, and make collaboration across the project easier.
I'm still tuning the prompts. Right now it costs me less than $2 a day in tokens, so I'm committed to running it for at least a year to see whether it's genuinely useful. If it proves valuable, I could imagine giving it a proper home, with search, filtering, and custom feeds.
For now, subscribe to a feed and tell me what you think.
The Drop Times: Florida DrupalCamp Begins 20 February in Orlando with Canvas and AI in Focus
Talking Drupal: TD Cafe #015 - Karen & Stephen - Non-Profit Summit at DrupalCon
Join Karen Horrocks and Stephen Musgrave as they introduce the upcoming non-profit summit at DrupalCon 2026 in Chicago. In this comprehensive fireside chat, they explore how AI can be integrated to serve a nonprofit's mission, plus the dos and don'ts of AI implementation. Hear insights from leading nonprofit professionals, learn about the variety of breakout sessions available, and discover the benefits of Kubernetes for maximizing ROI. Whether you're a developer, content editor, or a strategic planner, this session is crucial for understanding the future of nonprofit operations with cutting-edge technology.
For show notes visit: https://www.talkingDrupal.com/cafe015
Topics
- Introduction
- Meet Karen & Stephen
- Karen's Journey to Nonprofit Work
- Deep Dive into Drupal and Nonprofit Websites
- Capella's Approach to Continuous Improvement
- Nonprofit Summit Overview
- Exploring Summit Themes: AI and Resiliency
- Digital Sovereignty and Ethical Considerations
- Additional Breakout Sessions and Topics
- Community Engagement and Registration Details
- Conclusion and Final Thoughts
Stephen (he/him) is a co-founder, partner and Lead Technologist at Capellic, an agency that builds and maintains websites for non-profits. Stephen is bullish on keeping things simple – not simplistic. His goal is to maximize the return on investment and minimize the overhead of maintaining the stack for the long term.
Stephen has been working with the web for over 30 years. He was initially drawn to the magic of using code to create web art, added in his love for relational databases, and has spent his career building websites with an unwavering commitment to structured content.
When Stephen isn't at his desk, he's often running to and swimming in Barton Springs Pool, getting a bit too wound-up at Austin FC games, and playing Legos with his little one.
Karen Horrocks
Karen (she/her, karen11 on drupal.org and Drupal Slack) is a Web and Database Developer for the Physicians Committee for Responsible Medicine, a nonprofit dedicated to saving and improving human and animal lives through plant-based diets and ethical and effective scientific research.
Karen began her career as a government contractor at NASA Goddard Space Flight Center developing websites to distribute satellite data to the public. She moved to the nonprofit world when the Physicians Committee, an organization that she supports and follows, posted a job opening for a web developer. She has worked at the Physicians Committee for over 10 years creating websites that provide our members with the information and tools to move to a plant-based diet.
Karen is a co-moderator of NTEN's Nonprofit Drupal Community. She spoke on a panel at the 2019 Nonprofit Summit at DrupalCon Seattle and is helping to organize the 2026 Nonprofit Summit at DrupalCon Chicago.
Resources
- Nonprofit Summit Agenda: https://events.drupal.org/chicago2026/session/summit-non-profit-guests-must-pre-register
- Register for the Summit (within the DrupalCon workflow): https://events.drupal.org/chicago2026/registration
- Funding Open Source for Digital Sovereignty: https://dri.es/funding-open-source-for-digital-sovereignty
- NTEN's Drupal Community of Practice Zoom call (1p ET on the third Thursday of the month except August and December): https://www.nten.org/drupal/notes
- Nonprofit Drupal Slack Channel: #nonprofits on Drupal Slack
Guests
- Karen Horrocks - karen11 - www.pcrm.org
- Stephen Musgrave - capellic - capellic.com
Drupal AI Initiative: SearXNG - Privacy-First Web Search for Drupal AI Assistants
If you’ve been following the rapid rise of AI‑driven chatbots and ‘assistant‑as‑a‑service’ platforms, you know one of the biggest pain points is trustworthy, privacy‑preserving web search. AI assistants need access to current information to be useful, yet traditional search engines track every query, building detailed user profiles.
Enter SearXNG - an open‑source metasearch engine that aggregates results from dozens of public search back‑ends while never storing personal data. The new Drupal module lets any Drupal‑based AI assistant (ChatGPT, LLM‑powered bots, custom agents) invoke SearXNG directly from the Drupal site, bringing privacy‑first searching in‑process with your content.
What is SearXNG?
SearXNG aggregates results from up to 247 search services without tracking or profiling users. Unlike Google, Bing or other mainstream search engines, SearXNG removes private data from search requests and doesn't forward anything from third-party services.
Think of it as a privacy-preserving intermediary: your query goes to SearXNG, which then queries multiple search engines on your behalf and aggregates the results, all while keeping your identity completely anonymous.
The Drupal SearXNG Module
The Drupal SearXNG module brings this privacy-focused search capability directly into the Drupal ecosystem. It connects Drupal with your preferred SearXNG server (local or remote), includes a demonstration block, and provides an additional submodule that integrates SearXNG with Drupal AI by offering an AI Agent Tool.
This integration is particularly powerful when combined with Drupal's growing AI ecosystem, including the AI module framework, AI Agents and AI Assistants API.
Key Benefits
Privacy by design
The most compelling benefit is complete privacy protection. When your Drupal AI assistant uses SearXNG to search the web:
- No user tracking or profiling occurs
- Search queries aren't stored or analysed
- IP addresses remain private
- No targeted advertising based on searches
- Compliance with privacy regulations like GDPR
This makes it ideal for organisations in healthcare, government, education and any sector where data privacy is paramount.
Comprehensive search results
By aggregating results from up to 247 search services, SearXNG provides more diverse and comprehensive search results than relying on a single search engine. Your AI assistant gets a broader perspective, potentially finding information that might be missed by individual search engines.
Self-hosted control
Organisations can run their own SearXNG instance, giving them complete control over:
- Which search engines to query
- Rate limiting and usage patterns
- Data residency requirements
- Custom configurations and preferences
- Complete audit trails
Getting started is remarkably straightforward thanks to SearXNG's official Docker image, which makes launching a local server as simple as running a single command. This means organisations can have their own private search instance running in minutes, without complex server configuration or dependencies.
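As a sketch of that single command (the image name follows the SearXNG project's published Docker image; the port mapping and container name are examples, not prescriptions):

```shell
# Start a local SearXNG instance, reachable at http://localhost:8080
docker run -d --name searxng -p 8080:8080 searxng/searxng
```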
Seamless AI integration
The module's AI Agent Tool integration means that Drupal AI assistants can seamlessly incorporate web search into their workflows. Whether it's a chatbot helping users navigate your site or an AI assistant helping content creators research topics, web search becomes just another capability in the assistant's toolkit.
Practical Use Cases
Internal knowledge assistants
Imagine a corporate intranet where employees use an AI assistant to find both internal documentation and external resources. The assistant can search your internal Drupal content while using SearXNG to find external information, all while maintaining complete privacy about what employees are researching.
Privacy-conscious educational platforms
Universities and schools increasingly need to protect student privacy. A Drupal-powered learning management system with an AI tutor can use SearXNG to help students research topics without creating profiles of their academic interests and struggles.
Government and public sector portals
Government organisations can leverage AI assistants to help citizens find information and services. Using SearXNG ensures that citizen queries remain private and aren't used for commercial purposes.
The Future of Privacy-First AI
The SearXNG Drupal module represents an important step forward in building AI systems that respect user privacy. As AI assistants become more prevalent in web applications, the ability to access current information without compromising privacy will become increasingly valuable.
Drupal's AI framework supports over 48 AI platforms, providing flexibility in choosing AI providers. By combining this with privacy-respecting search through SearXNG, organisations can build powerful, intelligent applications that align with growing privacy expectations and regulations.
Conclusion
Privacy and powerful AI don't have to be mutually exclusive. The SearXNG Drupal module proves that organisations can build intelligent, helpful AI assistants that respect user privacy. Whether you're building internal tools, public-facing applications, or specialised platforms, this module provides a foundation for privacy-first AI that can search the web without compromising user trust.
As data privacy regulations continue to evolve and users become more aware of digital privacy issues, tools like the SearXNG module will become increasingly essential. By adopting privacy-first approaches now, organisations can build user trust while delivering the intelligent, helpful experiences that modern web applications demand.
Find out more and download on the dedicated SearXNG Drupal project page.
Smartbees: Search Engine Query Reporting
Capellic: Stephen appears on TD Cafe: What to expect at the Nonprofit Summit at DrupalCon Chicago
A Drupal Couple: The Blueprint for Affordable Drupal Projects
For years we have been talking about how Drupal got too expensive for the markets we used to serve. Regional clients, small and medium businesses in Latin America, Africa, Asia, anywhere where $100,000 websites are simply not a reality. We watched them go to WordPress. We watched them go to Wix. Not because Drupal was worse, but because the economics stopped working.
That conversation is changing.
Drupal CMS 2.0 landed in January 2026. And with it came a set of tools that, combined intelligently, make something possible that was not realistic before: an affordable, professional Drupal site delivered for $2,000, with margin, for markets that could not afford us before.
I want to show you the math. Not to sell you a fantasy, but because I did the exercise and the numbers work. And I am being conservative.
What changed
The real budget killer was always theming. Getting a site to look right, behave right, and be maintainable took serious senior hours. That is where budgets went.
Recipes pre-package common configurations so you are not starting from zero. Canvas lets clients and site builders assemble and manage pages visually once a developer sets up the component library.
Dripyard brings professional Drupal themes built specifically for Canvas (although they also work with Layout Builder, Paragraphs, etc.), with excellent quality and accessibility, at around $500. That may seem expensive, but the code quality, designs, and accessibility are top notch and will save at least 20 hours (usually much more), which would easily eat up a small budget.
Three tools. One problem solved.
We proved the concept about a month ago with laollita.es, built in three days using Umami as a starting point; think of Umami as a version 0.5 of what a proper template should be. We used Drupal AI for translations and AI-assisted development for CSS and small components, all without formal templates. With proper templates, it gets faster.
The $2,000 blueprint
Scope first. Most small business sites are simple: services, about us, blog, team, contact. The moment you add custom modules or complex requirements, the budget goes up. This blueprint is for projects that accept that constraint.
Start with Drupal CMS and a Dripyard theme. Recipes handle the configuration. Add AI assistance: a paid plan with a capable model such as Claude runs between $15 and $50 depending on usage. Let it help you move faster, but supervise everything. The moment you stop reviewing AI decisions is the moment quality starts leaking.
For hosting, go with a Drupal CMS-specific provider like Drupito, Drupal Forge, or Flexsite, around $20 to $50 per month. Six months included for your client is $300. Those same $300 could go toward a site template from the marketplace launching at DrupalCon Chicago in March 2026, compressing your development time further.
With a constrained scope, the right tools, and AI under supervision, ten hours of net work is realistic. At LATAM-viable rates, $30 per hour on the high side, that is $300 in labor.
The cost breakdown: $500 theme, $300 hosting or template, $300 labor, $50 AI tools. Total: $1,150. Add a $300 buffer and you are at $1,450. Charge $2,000. Your profit is $550, a 27.5% margin.
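The arithmetic above can be sanity-checked in a few lines (all figures are the article's own; nothing here is an external assumption):

```javascript
// The $2,000 blueprint math from the post.
const costs = { theme: 500, hostingOrTemplate: 300, labor: 300, aiTools: 50 };
const base = Object.values(costs).reduce((a, b) => a + b, 0); // itemized costs
const buffer = 300;
const total = base + buffer;          // total outlay including buffer
const price = 2000;                   // what you charge the client
const profit = price - total;
const marginPct = (profit / price) * 100;
console.log({ base, total, profit, marginPct });
// { base: 1150, total: 1450, profit: 550, marginPct: 27.5 }
```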
And I am being conservative. As you build experience with the theme, develop your own component library, and refine your tooling, the numbers improve. The first project teaches you. The third one pays better.
The $1,000 path
Smaller budget, smaller scope. Start with Byte or Haven, two Drupal CMS site templates on Drupal.org, or generate an HTML template with AI for around $50. A site template from the upcoming marketplace will run around $300.
The math: $300 starting point, $150 for three months of hosting, $200 incidentals. Cost: $650. Charge $1,000. Margin: 35%.
A $1,000 project is a few pages, clear scope, no special requirements. Both you and the client have to be honest about that upfront.
The real value for your client
When a client chooses Wix or WordPress to save money, they are choosing a ceiling. The day they need more, they are either rebuilding from scratch or paying for plugins and extras that someone still has to configure, maintain, and update every time the platform breaks something.
A client on Drupal CMS is on a platform that grows with them. The five-page site today can become a complex application tomorrow, on the same platform, without migrating. That is the conversation worth having. Not just what they get today, but what they will never have to undo.
The tools are there
The market in Latin America, Africa, Asia, and similar regions was always there. We just did not have the tools to serve it profitably. Now we do.
Drupal CMS, Canvas, Recipes, Dripyard, Drupal CMS-specific hosting, AI assistance with human oversight. The toolkit exists. Get back on trail.
Author: Carlos Ospina
Abstract: Drupal CMS 2.0, Canvas, Recipes, and Dripyard have changed the economics of Drupal for regional markets. This is the blueprint for building professional Drupal sites at $1,000 to $2,000 with real margin, for LATAM, Africa, Asia, and similar markets that Drupal could not serve before.
Tags: Drupal, Drupal Planet, drupal-cms, canvas, recipes, dripyard, affordable-development, small-business, latam, africa, asia, blueprint
DDEV Blog: DDEV February 2026: v1.25.0 Ships, 72% Market Share, and New Training Posts
DDEV v1.25.0 is here, and the community response has been strong. This month also brought three new training blog posts and a survey result that speaks for itself.
What's New
- DDEV v1.25.0 Released → Improved Windows installer (no admin required), XHGui as default profiler, updated defaults (PHP 8.4, Node.js 24, MariaDB 11.8), faster snapshots with zstd compression, and experimental rootless container support. Read the release post↗
- New ddev share Provider System → Free Cloudflare Tunnel support, no login or token required. A modular provider system with hooks and CMS-specific configuration. Read more↗
- Mutagen in DDEV: Functionality, Issues, and Debugging → Based on the January training session, this post covers how Mutagen works, common issues, and the new ddev utility mutagen-diagnose command. Read more↗
- Xdebug in DDEV: Understanding and Troubleshooting Step Debugging → How the reverse connection model works, IDE setup for PhpStorm and VS Code, common issues, and the new ddev utility xdebug-diagnose command. Read more↗
The 2026 CraftQuest Community Survey↗ collected responses from 253 Craft CMS developers and found DDEV at 72% market share for local development environments. The report notes: "This near-standardization simplifies onboarding for newcomers, reduces support burden for plugin developers, and means the ecosystem can optimize tooling around a single local dev workflow."
Conference Time!
I'll be at Florida DrupalCamp this week, where I'll speak on using git worktree to run multiple versions of the same site. I'd love to see you, sit down, and hear your experience with DDEV and ways you think it could be better.
Then in March I'll be at DrupalCon Chicago and as usual will do lots of Birds-of-a-Feather sessions about DDEV and related topics. Catch me in the hall, or let's sit down and have a coffee.
Community Highlights
- ddev-mngr → A Go-based command-line tool with an interactive terminal UI for managing multiple DDEV projects at once — start, stop, check status, and open URLs across projects. With this add-on Olivier Dobberkau inspired a new TUI approach for DDEV core as well! View on GitHub↗
- TYPO3 DDEV Agent Skill → Netresearch built an Agent Skill (compatible with Claude Code, Cursor, Windsurf, and GitHub Copilot) that automates DDEV environment setup for TYPO3 extension development, including multi-version testing environments for TYPO3 11.5, 12.4, and 13.4 LTS. View on GitHub↗
- Using Laravel Boost with DDEV → Russell Jones explains how to integrate Laravel Boost (an official MCP server) with DDEV, giving AI coding agents contextual access to routes, database schema, logs, and configuration. Read on Dev.to↗
- Laravel VS Code Extension v1.4.2 → Now includes Docker integration support and a fix for Pint functionality within DDEV environments. Read more↗
- Getting Started with DDEV for Drupal Development → Ivan Zugec at WebWash published a guide covering installation, daily commands, database import/export, Xdebug setup, and add-ons. Read on WebWash↗
- Environnement de développement WordPress avec DDEV → Stéphane Arrami shares a practical review of adopting DDEV for WordPress development, covering client projects, personal sites, and training (in French). Read more↗
"I was today years old when I found out that DDEV exists. Now I am busy migrating all projects to Docker containers." — @themuellerman.bsky.social↗
"ddev is the reason I don't throw my laptop out of the window during local setup wars. one command to run the stack and forget the rest. simple as that." — @OMascatinho on X↗
v1.25.0 Upgrade Notes and Known Issues
Every major release brings some friction, and v1.25.0 is no exception. These will generally be solved in v1.25.1, which will be out soon. Here's what to watch for:
- deb.sury.org certificate expiration on v1.24.x → The GPG key for the PHP package repository expired on February 4, breaking ddev start for users still on v1.24.10 who needed to rebuild containers. We pushed updated images for v1.24.10, so you can either ddev poweroff && ddev utility download-images or just go ahead and upgrade to v1.25.0, which shipped with the updated key. Details↗
- MariaDB 11.8 client and SSL → DDEV v1.25.0 ships with MariaDB 11.8 client (required for Debian Trixie), which defaults to requiring SSL. This can break drush sql-cli and similar tools on MariaDB versions below 10.11. Workaround: add extra: "--skip-ssl" to your drush/drush.yml under command.sql.options, or upgrade your database to MariaDB 10.11+. Details↗
- MySQL collation issues → Importing databases can silently change collations, leading to "Illegal mix of collations" errors when joining imported tables with newly created ones. Separately, overriding MySQL server collation via .ddev/mysql/*.cnf doesn't work as expected. #8130↗ #8129↗
- Inter-container HTTP(S) communication → The ddev-router doesn't always update network aliases when projects start or stop, which can break container-to-container requests for *.ddev.site hostnames. Details↗
- Downgrading to v1.24.10 → If you need to go back to v1.24.10, you'll need to clean up ~/.ddev/traefik/config — leftover v1.25.0 Traefik configuration breaks the older version. Details↗
- Traefik debug logging noise → Enabling Traefik debug logging surfaces warning-level messages as "router configuration problems" during ddev start and ddev list, which looks alarming but is harmless. Details↗
- ddev npm and working_dir → ddev npm doesn't currently respect the working_dir web setting, a difference from v1.24.10. Details↗
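For the MariaDB 11.8 SSL issue above, the described workaround translates into a drush/drush.yml fragment along these lines (a sketch based on the post's wording; verify it against your Drush version):

```yaml
# drush/drush.yml
command:
  sql:
    options:
      extra: "--skip-ssl"
```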
As always, please open an issue↗ if you run into trouble — it helps us fix things faster. You're the reason DDEV works so well!
DDEV Training Continues
Join us for upcoming training sessions for contributors and users.
February 26, 2026 at 10:00 US ET / 16:00 CET — Git bisect for fun and profit Add to Google Calendar • Download .ics
March 26, 2026 at 10:00 US ET / 15:00 CET — Using git worktree with DDEV projects and with DDEV itself Add to Google Calendar • Download .ics
April 23, 2026 at 10:00 US ET / 16:00 CEST — Creating, maintaining and testing add-ons 2026-updated version of our popular add-on training. Previous session recording↗ Add to Google Calendar • Download .ics
Zoom Info: Link: Join Zoom Meeting Passcode: 12345
Sponsorship Update
After the community rallied in January, sponsorship has held steady and ticked up slightly. Thank you!
Previous status (January 2026): ~$8,208/month (68% of goal)
February 2026: ~$8,422/month (70% of goal)
If DDEV has helped your team, now is the time to give back. Whether you're an individual developer, an agency, or an organization — your contribution makes a difference. → Become a sponsor↗
Contact us to discuss sponsorship options that work for your organization.
Stay in the Loop—Follow Us and Join the Conversation
Compiled and edited with assistance from Claude Code.