Evolving Web: Designing a digital archive in partnership with an Indigenous community
Lessons for building a digital repository of archival material, stories, or user-generated knowledge.
Digital archives play an increasingly important role in preserving cultural knowledge, personal histories, and community memory. But not all archives are created equal. Beyond simply storing information, the most effective digital archives are designed to be welcoming, respectful, and alive — spaces that invite exploration while honouring the people and knowledge they represent.
At Evolving Web, we recently collaborated with the University of Denver on the Our Stories, Our Medicine Archive (OSOMA), a community-owned digital archive that centres traditional Indigenous knowledge related to health, wellness, culture, and identity. Built in close collaboration with community partners, OSOMA offers a powerful example of how digital repositories can move beyond institutional models toward something more participatory and human.
If you’re working on a digital archive — whether it’s focused on cultural heritage, community storytelling, or user-generated knowledge — here are some key lessons from OSOMA that can help guide your approach.
Design for discoverability, not just storage
A strong digital archive doesn’t assume users know exactly what they’re looking for. Instead, it supports exploration and discovery.
On OSOMA, visitors can browse content by broad themes such as Plants, Food, Ceremony, Identity, and Land. From there, they can narrow their focus using more specific filters, for example, exploring knowledge connected to particular healing practices or types of medicine.
This structure allows users to move easily between big ideas and specific stories. Someone might begin by browsing “Plant Medicine” and then discover individual narratives, videos, or related knowledge shared by community members. The archive encourages curiosity rather than forcing users into rigid pathways.
By organizing content around themes that reflect Indigenous worldviews rather than academic or institutional categories, OSOMA makes it easier for users to find meaning, not just information.
Use plain language to build trust
Plain language plays an important role in making digital archives accessible, but it also shapes how users feel when they engage with the content.
Across OSOMA, headlines, descriptions, and navigation labels are written in clear, approachable language. The content doesn’t feel instructional or authoritative, and it avoids positioning itself as a definitive source of medical advice. Instead, it presents stories, experiences, and teachings in a way that feels open-ended and respectful.
This tone is especially important for an archive focused on health and wellness. By avoiding prescriptive language, OSOMA creates space for users to learn without pressure, and reinforces that the knowledge being shared belongs to the community, not the platform.
Make it easy to access knowledge quickly
OSOMA includes rich media such as videos and interviews, and the way users access that content is intentional.
For example, users can watch videos directly from search and results pages, without needing to click through multiple screens. This makes it easier to sample content, follow related threads, and continue exploring without losing context.
These small experience details matter. They reduce friction and make the archive feel responsive and intuitive, especially for users who may be less comfortable navigating complex digital interfaces.
Focus on personal stories over institutions
Many digital archives unintentionally feel institutional, even when they contain deeply personal material. OSOMA takes a different approach by placing individual voices front and centre.
Each community member has a dedicated profile page that brings together their stories, interviews, and related knowledge items. These profiles help users understand who is sharing the knowledge, where it comes from, and how it connects to lived experience.
Stories aren’t treated as supplementary content; they are the foundation of the archive. This storytelling-first approach reflects Indigenous knowledge traditions, where stories are a primary way of sharing history, values, and healing practices. The result is an archive that feels human and relational, rather than abstract or academic.
An OSOMA community member profile brings individual voices to the forefront, weaving together personal stories, interviews, and related knowledge to show how lived experience anchors the archive.
Make participation visible and welcoming
OSOMA was designed as a living, community-owned archive, and that intention is visible throughout the site.
Links and prompts to contribute are displayed prominently, making it clear that community members are invited to share their own stories and knowledge. Even visitors who never log in or submit content can immediately sense that OSOMA is shaped by ongoing participation.
Behind the scenes, the platform supports this model by allowing Indigenous users to log in, contribute content, and access protected cultural knowledge. Using Drupal’s Group functionality, the site ensures that sensitive information remains visible only to appropriate community members.
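For teams considering a similar setup, a minimal sketch of the moving parts might look like the commands below; the group type and roles mentioned in the comments are hypothetical examples, not the actual OSOMA configuration.

# Minimal sketch, assuming a Composer-based Drupal site (names are illustrative)
composer require drupal/group
drush pm:enable group -y
# From here, the access model is configured in the admin UI: create a group type
# (for example, a hypothetical "community_knowledge" type), add community members
# with appropriate roles, and grant view permissions on protected content only
# to those member roles.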
Participation isn’t treated as an add-on; it’s built into the structure of the archive itself.
OSOMA invites community members to contribute their own stories, ensuring the archive continues to grow through shared knowledge, relationships, and lived experience.
Use design to support confidence and cohesion
Strong visual design helps establish trust, especially when an archive contains many voices and content types.
OSOMA uses photography and video of people, land, and cultural assets to ground the experience in real places and lived relationships. Circular image frames and a consistent colour palette draw from OSOMA’s visual identity and help tie together diverse content.
These design choices do important work quietly. They lend confidence to the stories being shared and ensure the site feels cohesive, even as new contributions are added over time. Rather than competing with the content, the design supports it, creating space for stories to speak for themselves.
Accessibility is foundational, not optional
OSOMA was built to be welcoming to a wide range of users, including Elders, youth, and non-specialist visitors.
The site meets WCAG AA accessibility standards, with clear layouts, strong colour contrast, and plain-language content. Navigation and browsing tools were designed to be intuitive, so users can explore without needing technical expertise.
Accessibility here isn’t treated as a compliance exercise. It’s part of a broader commitment to inclusion, respect, and ease of use: values that align closely with OSOMA’s community-led goals.
Building archives that honour living knowledge
OSOMA demonstrates that digital archives don’t have to replicate colonial or extractive models of knowledge storage. With the right approach, they can become spaces of connection, care, and continuity.
By prioritizing discoverability, plain language, personal storytelling, participation, strong design, and accessibility, OSOMA offers a powerful example of what’s possible when technology is shaped by community values.
If you’re thinking about building a digital archive or knowledge platform, this project is a reminder to look beyond the technical requirements and ask deeper questions about ownership, voice, and experience.
Get in touch to talk about building digital platforms that are inclusive, future-friendly, and people-first.
Learn more about the OSOMA project by reading the case study.
MidCamp - Midwest Drupal Camp: Catch up on all the MidCamp you missed!
Watch the Dries fireside chat from 2025, or catch up on all of the sessions from last year on Drupal.tv.
There's even more Drupal goodness to be had in our archives or Drupal.tv's.
The Archives: 2024 2023 2022 2021 2020 2019 2018 2017 2016 2015 2014
Dries Buytaert: Drupal CMS 2.0 released
Today we released Drupal CMS 2.0. I've been looking forward to this release for a long time!
If Drupal is 25 years old, why only version 2.0? Because Drupal Core is the same powerful platform you've known for years, now at version 11. Drupal CMS is a product built on top of it, packaging best-practice solutions and extra features to help you get started faster. It was launched a year ago as part of Drupal Starshot.
Why build this layer at all? Because the criticism has been fair: Drupal is powerful but not easy. For years, features like easier content editing and better page building have topped the wishlist.
Drupal CMS is changing Drupal's story from powerful but hard to powerful and easy to use.
With Drupal CMS 2.0, we're taking another big step forward. You no longer begin with a blank slate. You can begin with site templates designed for common use cases, then shape them to fit your needs. You get a visual page builder, preconfigured content types, and a smoother editing experience out of the box. We also added more AI-powered features to help draft and refine content.
The biggest new feature in this release is Drupal Canvas, our new visual page builder that now ships by default with Drupal CMS 2.0. You can drag components onto a page, edit in place, and undo changes. No jumping between forms and preview screens.
WordPress and Webflow have shown how powerful visual editing can be. Drupal Canvas brings that same ease to Drupal with more power while keeping its strengths: custom content types, component-based layouts, granular permissions, and much more.
But Drupal Canvas is only part of the story. What matters more is how these pieces are starting to fit together, in line with the direction we set out more than a year ago: site templates to start from, a visual builder to shape pages, better defaults across the board, and AI features that help you get work done faster. It's the result of a lot of hard work by many people across the Drupal community.
If you tried Drupal years ago and found it too complex, I'd love for you to give it another look. Building a small site with a few landing pages, a campaign section, and a contact form used to take a lot of setup. With Drupal CMS 2.0, you can get something real up and running much faster than before.
For 25 years, Drupal traded ease for power and flexibility. That is finally starting to change, while keeping the power and flexibility that made Drupal what it is. Thank you to everyone who has been pushing this forward.
The Drop Times: Zoocha Rebrands as Digital Experience Agency Powered by Drupal
Drupal blog: Drupal CMS 2.0 is here: Visual building, AI, and site templates transform Drupal
January 28, 2026 – Today marks one of the biggest evolutions in Drupal's 25-year history.
Drupal CMS 2.0 launches with Drupal Canvas and AI-powered tools, and introduces a component system along with the first site template that enables marketing teams to launch fully branded, professional websites in days instead of weeks. Built on Drupal core, it maintains the enterprise-grade security, scalability, and flexibility Drupal is known for.
Try it now → drupal.org/drupal-cms
What's in 2.0
Drupal CMS 2.0 is built on top of Drupal Core 11.3, which delivers the biggest performance improvement in a decade, allowing you to serve 26-33% more requests with the same setup.
We are introducing Drupal Canvas as the default editing experience. Drag components onto pages with live preview and real-time editing. No more switching between admin forms and preview windows for your landing pages – build directly on the page. No Drupal knowledge required to get started.
The custom-built Mercury component library provides common building blocks like cards, testimonials, heroes, menus, and accordions.
We are introducing site templates that provide feature-complete starting points for specific use cases. Byte is the first template included with Drupal CMS 2.0. It is preconfigured as a marketing site for a SaaS-based product, with blog, newsletter signup, pricing pages, and a contact form, with an elegant dark design. All built with Canvas. Installs in under 3 minutes.
Recipe-based integrations automate complex configurations:
- Mailchimp integration automatically grabs audiences from your instance after you authenticate and creates signup form blocks ready to drop into Canvas pages
- Recipe system turns "how did I do this last time?" into one-click operations (see the sketch below)
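For a rough sense of what recipe-based setup looks like in practice, applying a recipe from the command line is a single step. This is a hedged sketch: the recipe name below is a placeholder, not one that ships with Drupal CMS.

# Apply a recipe to an installed site (the recipe path is a placeholder)
cd /path/to/drupal
php core/scripts/drupal recipe recipes/example_newsletter_signup
# Rebuild caches so the new configuration takes effect
drush cache:rebuild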
AI tools (optional):
- Generate complete pages from text prompts using all available Canvas components
- Admin chatbot helps with site-building tasks like creating content types, defining taxonomy terms, and adding fields - guiding you from intent to configuration faster
- AI-assisted alt text generation for images improves accessibility across your site while allowing human review
- Built-in support for amazee.ai Private AI Provider (free tokens included), plus OpenAI and Anthropic - no complex setup required
- AI Dashboard provides central visibility into available AI features and configured providers
Plus all of these proven goodies from Drupal CMS 1 (January 2025):
- Streamlined installer with smart defaults
- Project Browser for discovering and installing modules
- Automatic updates for security patches
- Recipes system for packaging and sharing configurations
- Modern admin UI with Gin theme
- SEO tools out-of-the-box
- Accessibility checking built-in
- Data privacy compliance features
Drupal CMS 2.0 would not have been possible without the innovations in Drupal core and the visual tools and components built specifically for this release. Thanks to the hundreds of contributors across dozens of organizations. Special thanks to the AI initiative partners, and everyone who tested, filed issues, and pushed boundaries outward.
This is community-driven development at scale.
Download and get started
Try it now: drupal.org/drupal-cms/trial
Download: drupal.org/download
Learn more: drupal.org/drupal-cms
Twenty-five years in. Still building.
Drupal CMS builds on Drupal Core with full ecosystem compatibility, adding visual building tools, AI assistance, and industry-specific templates. Learn more →
Web Wash: Getting Started with DDEV for Drupal Development
Setting up a local Drupal development environment requires tools that handle web servers, databases, and PHP configuration. DDEV provides a Docker-based solution that simplifies this process while maintaining flexibility for different project requirements.
In the video above, you'll learn how to install and configure DDEV, create a new Drupal project, use essential commands for daily development, import and export databases, set up debugging with Xdebug, and extend DDEV with add-ons and custom commands.
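As a rough outline of the workflow covered in the video, a typical DDEV session might look like the sketch below. Flags, project-type names, and command spellings vary slightly between DDEV releases, so treat it as a guide rather than exact commands.

# Minimal sketch of a DDEV-based Drupal setup (adjust to your DDEV version)
mkdir my-drupal-site && cd my-drupal-site
ddev config --project-type=drupal --docroot=web
ddev start
ddev composer create drupal/recommended-project
ddev composer require drush/drush
ddev drush site:install -y
# Day-to-day helpers
ddev import-db --file=backup.sql.gz   # import a database dump
ddev export-db --file=backup.sql.gz   # export the current database
ddev xdebug on                        # enable step debugging with Xdebug
ddev launch                           # open the site in your browser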
Pivale: Who really owns your digital platforms?
The Drop Times: Dependency, Not Geography, is the Risk!
Europe’s push for digital sovereignty is gaining momentum, but much of the conversation remains superficial. Drawing on the recent analysis by Dries Buytaert, founder of Drupal, the real issue is not whether governments use European or non-European vendors—it’s whether they retain meaningful control over the software that underpins public services. Dependency, not geography, is the risk. Several public institutions are beginning to act on this insight, but the structural implications remain largely unaddressed.
Dries' argument reframes open source from a technical preference into a governance imperative. Open source offers auditability, portability, and independence that proprietary systems cannot. Yet, while Europe’s public sector heavily relies on open source, it consistently fails to invest in its foundations. Procurement practices continue to channel funding toward large integrators and resellers, leaving the maintainers who secure and evolve the software underfunded and overstretched.
The result is a stark mismatch between policy ambitions and spending realities. Governments pay for delivery and compliance but neglect the upstream work that ensures long-term security, resilience, and innovation. As Buytaert makes clear, digital sovereignty won’t be achieved through strategy papers alone. It demands procurement policies that treat open-source contributions as a core public value—not an optional extra.
With that, let's move on to the important stories from the past week.
DRUPAL COMMUNITY
DISCOVER DRUPAL
- Drupal Migrate Plus 6.0.9 Requires PHP Attributes Over Annotations
- Drupal AI Adds mittwald Provider v1.0 to Support Hosted AI Models
- Display Builder Beta 1 Unifies Layout Interfaces for Drupal Site Builders
- Augusto Fagioli Releases Business Identity 1.0.0 to Streamline Company Data in Drupal
- Creodrop Promises One-Click Drupal Hosting Without the DevOps Headache
- Upcoming Zoocha Webinar Showcases Global Touring’s Drupal Success
- Drupal Powers EU Open Source Week 2026 With Policy, Innovation, and Community Events
- AmyJune Hineline Announced as Featured Speaker for Florida DrupalCamp 2026
- DrupalCamp Grenoble 2026 Opens Sponsorship Packages
- Dripyard and Lullabot to Deliver Future‑Proof Theming Training at DrupalCon Chicago 2026
- Drupal Iberia 2026 Set for 8–9 May in Braga, Tickets Now Available
We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now. To get timely updates, follow us on LinkedIn, Twitter, Bluesky, and Facebook. You can also join us on Drupal Slack at #thedroptimes.
Thank you.
Alka Elizabeth
Sub-editor
The DropTimes
Specbee: Drupal consulting explained: What it costs, what you gain, and how to pick the right Drupal partner
Pivale: Introducing Commerce Referral
Provides a referral system for Drupal Commerce that allows customers to refer friends and receive rewards.
Dominique De Cooman: Becoming the Intelligent Open Digital Experience Company
In the Dropsolid diaries series, I talk in-depth about the journey of Dropsolid company that has Drupal at its core. It contains Drupal insights, company insights, personal experiences, DXP and CMS market insights, and many other learnings I learned as the founder of Dropsolid & Dropsolid AI.
Drupal.org blog: GitLab issue migration: the new workflow for migrated projects
As we mentioned in our last blog post GitLab issue migration: immediate changes, we will continue to migrate more and more projects.
We gathered a list of projects where their maintainers agreed to help us test the migration process at #3409678: Opt-in GitLab issues. What does it mean if your project is being migrated or if you are collaborating in one of those migrated projects?
Changes to issue management
If your project has been migrated to GitLab, you will now manage all your issues via the GitLab issue listing and/or issue boards. As maintainers, you will be able to set up issue boards to follow the workflow that makes the most sense for your project. Some projects might just have "Open" and "Closed" columns (the default setup), some might want to add an "RTBC" column based on the existing "state::rtbc" label, and some might want to define more complex issue transitions. This is similar to what we did for the transition to GitLab CI, where we provide defaults for all projects, but each maintainer can configure their own ways of managing their issues.
As with other open source projects, only maintainers will be able to configure the issue boards, set labels for the issues or even change issue status. This is a big workflow change from what we have now, but it aligns with how many other projects are managed.
All labels (tags, version, priority, etc) are now project-specific, giving maintainers full freedom to choose the ones that make the most sense for their projects.
Fork management
Whilst using GitLab issues brings us closer to workflows in other communities, our forking model remains the same as it was until now, which is collaborative by default. We believe that this is the easiest way to work together as a community.
This means that we will not have personal forks (we never have), and we will continue having shared forks (we always have). GitLab does not support this forking model out of the box, so we needed to implement this capability in the new system. As we did so, we used the opportunity to simplify the process compared to that of Drupal.org issues.
We will have a new place to create forks and request access, which will be a new tab available when viewing the contribution record for the issue. This new tab will read 100% of its information from GitLab via Ajax. You can do the same things as you can now on Drupal.org issues: create forks and request access. You can even do some of these things from the issue page (more about this below).
Actions like creating branches or merge requests will be just links to GitLab, as that's something that can already be done there.
Automated messages
We understand that the above adds a new step to a workflow that previously happened entirely within the issue page. To make the workflow easier, we are adding automated messages to issues that will take you back and forth between the pages, inform you about forks that have been created, and so on.
What's not changing?
The contribution records system that we deployed a few months ago will not change; it will remain exactly the same as it is today. You will have links to go back and forth between the issues and their contribution record, the same way as you do right now with Drupal.org issues.
What's next?
The roadmap remains unchanged (in each iteration, we will address feedback, fix bugs, and so on):
- Migrate projects that opted in (we are here now)
- Make this the default for new projects
- Migrate low-risk, low-usage, and/or sandbox projects
- Migrate remaining projects, excluding a few selected high-volume, high-risk ones
- Migrate the rest of the projects, including core
ImageX: Mastering Robots.txt: An Essential SEO Tool for Your Drupal Site
When we think of robots, we often picture shiny machines whirring around in sci‑fi movies, or perhaps we think of something that is gradually becoming part of our reality. But not all robots are mechanical. In the world of SEO, search engine bots are tiny robots exploring your Drupal website, and with the right guidance, you can make sure they stick to the paths that matter.
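To make that guidance concrete, the fragment below shows the general shape of the directives involved. The paths are illustrative examples, not a recommended configuration: Drupal ships with a sensible default robots.txt that you should adapt rather than replace.

# Illustrative robots.txt fragment (example paths only)
User-agent: *
# Keep crawlers out of administrative and login pages
Disallow: /admin/
Disallow: /user/login
# Avoid spending crawl budget on sorted/filtered duplicates of listing pages
Disallow: /*?sort=
# Point bots at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml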
drunomics: drunomics joins the Drupal AI Initiative as Silver Maker
Talking Drupal: Talking Drupal #537 - Orchestration
Today we are talking about integrations into Drupal, automation, and orchestration with Drupal with guest Jürgen Haas. We'll also cover CRM as our module of the week.
For show notes visit: https://www.talkingDrupal.com/537
Topics
- Understanding Orchestration
- Orchestration in Drupal
- Introduction to Orchestration Services
- Drupal's Role in Orchestration
- Flexibility in Integration
- Orchestration Module in Drupal
- Active Pieces and Open Source Integration
- Security Considerations in Orchestration
- Future of Orchestration in Drupal
- Getting Involved with Orchestration
- Orchestration
- N8N
- Drupal as an application
- Tools
Jürgen Haas - lakedrops.com jurgenhaas
Hosts
Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
MOTW Correspondent
Martin Anderson-Clutz - mandclu.com mandclu
- Brief description:
- Have you ever wanted a Drupal-native way to store, manage, and interact with people who might not all be registered users? There's a module for that.
- Module name/project name:
- Brief history
- How old: created in April 2007 by Allie Micka, but Steve Ayers (aka bluegeek9) later took over the namespace
- Versions available: 1.0.0-beta2, which works with Drupal 11.1 or newer
- Maintainership
- Actively maintained, latest release just a day ago
- Security coverage: opted in, but needs a stable release
- Test coverage
- Number of open issues: 73 open issues, but all bugs have been marked as fixed
- Usage stats:
- 10 sites
- Module features and usage
- Listeners may remember some mention of the CRM module in the conversation about the Member Platform initiative back in episode 512
- As a reminder, something other than standard Drupal user accounts is useful for working with contact information for people where you may not have all the criteria necessary for a Drupal user account, for example an email address. Also, a dedicated system can make it easier to model relationships between contacts, and provide additional capabilities.
- It's worth noting that this module defines CRM as Contact Relationship Management, not assuming that the data is associated with "customers" or "constituents" as some other solutions do
- At its heart, CRM defines three new entity types: contacts, contact methods, and relationships. Each of these can have fieldable bundles, and provides some default examples: Person, Household, and Organization for contacts; Address, Email, and Telephone for contact methods; and Head of household, Spouse, Employee, and Member for relationships
- Out of the box CRM includes integrations with other popular modules like Group and Context, in addition to a variety of Drupal core systems like views and search
- As previously mentioned CRM is intended to be the foundational data layer of the Member Platform, but is also a key element of the Open Knowledge distribution, meant to allow using Drupal as a collaborative knowledge base and learning platform
Droptica: How to Choose Drupal Hosting? Avoid Costly Mistakes
Which hosting to select for Drupal? This is one of the most frequently asked questions among people starting their work with this CMS. In this article, I'll explain what to pay attention to when choosing Drupal hosting and provide a brief overview of available options – based on 15 years of experience implementing Drupal for clients from Poland and abroad. I invite you to read the post or watch the video from the Nowoczesny Drupal series.
Très Bien Blog: …and now I'm recovering
After just a month of use I can see that my relationship with Claude Code is unhealthy. Like I mentioned when I tried Claude Code for a month, even when it was wasting my time I was having fun. Pretty big red flag.
theodore, January 26, 2026
Factorial.io: 25 Years of Drupal: From an Open Source Project to the Backbone of Digital Ecosystems
On January 15, Drupal turned 25 years old. What began in 2001 as a simple open source experiment by founder Dries Buytaert in Antwerp is now one of the most powerful frameworks for complex digital platforms worldwide. Drupal has grown up. And with it, the requirements for digital products.
Dries Buytaert: Automatically exporting my Drupal content to GitHub
This note is mostly for my future self, in case I need to set this up again. I'm sharing it publicly because parts of it might be useful to others, though it's not a complete tutorial since it relies on a custom Drupal module I haven't released.
For context: I switched to Markdown and then open-sourced my blog content by exporting it to GitHub. Every day, my Drupal site exports its content as Markdown files and commits any changes to github.com/dbuytaert/website-content. New posts appear automatically, and so do edits and deletions.
Creating the GitHub repository
Create a new GitHub repository. I called mine website-content.
Giving your server access to GitHub
For your server to push changes to GitHub automatically, you need SSH key authentication.
SSH into your server and generate a new SSH key pair:
ssh-keygen -t ed25519 -f ~/.ssh/github -N ""
This creates two files: ~/.ssh/github (your private key that stays on your server) and ~/.ssh/github.pub (your public key that you share with GitHub).
The -N "" creates the key without a passphrase. For automated scripts on secured servers, passwordless keys are standard practice. The security comes from restricting what the key can do (a deploy key with write access to one repository) rather than from a passphrase.
Next, tell SSH to use this key when connecting to GitHub:
cat >> ~/.ssh/config << 'EOF'
Host github.com
  IdentityFile ~/.ssh/github
  IdentitiesOnly yes
EOF
Add GitHub's server fingerprint to your known hosts file. This prevents SSH from asking "Are you sure you want to connect?" when the script runs:
ssh-keyscan github.com >> ~/.ssh/known_hosts
Display your public key so you can copy it:
cat ~/.ssh/github.pub
In GitHub, go to your repository's "Settings", find "Deploy keys" in the sidebar, and click "Add deploy key". Check the box for "Allow write access".
Test that everything works:
ssh -T git@github.com
You should see: You've successfully authenticated, but GitHub does not provide shell access.
The export script
I created the following export script:
#!/bin/bash
set -e
TEMP=/tmp/dries-export

# Clone the existing repository
git clone git@github.com:dbuytaert/website-content.git $TEMP
cd $TEMP

# Clean all directories so moved/deleted content is tracked
rm -rf */

# Export fresh content older than 2 days
drush node:export --end-date="2 days ago" --destination=$TEMP

# Commit and push if there are changes
git config user.email "dries+bot@buytaert.net"
git config user.name "Dries Bot"
git add -A
git diff --staged --quiet || {
  git commit -m "Automatic updates for $(date +%Y-%m-%d)"
  git push
}

rm -rf $TEMP
The drush node:export command comes from a custom Drupal module I built for my site. I have not published the module on Drupal.org because it's specific to my site and not reusable as is. I wrote about why that kind of code is still worth sharing as adaptable modules, and I hope to share it once Drupal.org has a place for them.
The two-day delay (--end-date="2 days ago") gives me time to catch typos before posts are archived to GitHub. I usually find them right after hitting publish.
The git add -A stages everything including deletions, so if I remove a post from my site, it disappears from GitHub too (though Git's history preserves it).
Scheduling the export
On a traditional server, you'd add this script to Cron to run daily. My site runs on Acquia Cloud, which is Kubernetes-based and automatically scales pods up and down based on traffic. This means there is no single server to put a crontab on. Instead, Acquia Cloud provides a scheduler that runs jobs reliably across the infrastructure.
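For reference, on a server where Cron is available, the scheduling piece is a one-line crontab entry; the script path and schedule below are placeholders.

# Hypothetical crontab entry: run the export daily at 03:00 and log the output
0 3 * * * /path/to/export.sh >> /var/log/export-content.log 2>&1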
And yes, this note about automatically backing up my content will itself be automatically backed up.
Dripyard Premium Drupal Themes: How an unclosed <em> broke Drupal’s JavaScript
Sometimes you hit a bug and your brain just goes, “huh.”
That was me earlier this week while trying to figure out why Drupal’s JavaScript was completely broken. But only on one page. And of course, this happened during a live demo!
You can actually see the moment it went sideways here. This is the story of how I tracked it down.
The problem
Dripyard adds a bunch of options to our theme settings pages. On one particular theme, Great Lakes, the settings page was loading with JavaScript absolutely wrecked.