Drupal.org - aggregated feeds in category Planet Drupal
Updated: 6 hours 41 minutes ago

Dries Buytaert: Effortless inspecting of Drupal database queries

Thu, 03/14/2024 - 15:04

Drupal's database abstraction layer is great for at least two reasons. First, it ensures compatibility with various database systems like MySQL, MariaDB, PostgreSQL and SQLite. Second, it improves security by preventing SQL injection attacks.

My main concern with using any database abstraction layer is the seemingly reduced control over the final SQL query. I like to see and understand the final query.

To inspect and debug database queries, I like to use MariaDB's General Query Log. When enabled, this feature captures every query directly in a text file.

I like this approach for a few reasons:

  • It directly captures the query in MariaDB instead of Drupal. This provides the most accurate view of the final SQL queries.
  • It enables real-time monitoring of all queries in a separate terminal window, removing the need to insert debug statements into code.
  • It records both successful and unsuccessful queries, making it a handy tool for debugging purposes.

If you're interested in trying it out, here is how I set up and use MariaDB's General Query Log.

First, log in as the MySQL super user. Since I'm using DDEV for my development, the command to do that looks like this:

[code bash]$ ddev mysql -u root -proot[/code]

After logging in as the MySQL super user, you need to set a few global variables:

[code shell]SET global general_log = 1;
SET global log_output = 'file';
SET global general_log_file = '/home/dries/queries.txt';[/code]

This will start logging all queries to a file named 'queries.txt' in my home directory.
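If you want to confirm the settings took effect before you start watching the log, you can check them from the MySQL prompt. A minimal sketch, assuming ddev mysql passes extra options through to the mysql client the same way it passes -u and -p above:

[code bash]$ ddev mysql -u root -proot -e "SHOW VARIABLES LIKE 'general_log%';"[/code]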

DDEV uses containers: isolated environments designed to run specific software such as databases. To view the queries, you must log into the database container.

[code bash]$ ddev ssh -s db[/code]

Now, you can easily monitor the queries in real time from a separate terminal window:

[code bash]$ tail -f /home/dries/queries.txt[/code]
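On a busy site the log fills up fast. Piping the same tail through grep is an easy way to narrow it down to the statements you care about; the SELECT filter here is just an example:

[code bash]$ tail -f /home/dries/queries.txt | grep -i 'select'[/code]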

This is a simple yet effective way to monitor all database queries that Drupal generates.
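One closing caveat: the general query log writes every statement to disk, so treat it as a temporary debugging aid and switch it off when you are done. You can run the SET statement from the same MySQL session as before, or, assuming ddev mysql forwards extra options to the mysql client, as a one-liner:

[code bash]$ ddev mysql -u root -proot -e "SET GLOBAL general_log = 0;"[/code]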

Categories: Drupal News

Dries Buytaert: The new old: Jamstack and MACH's journey towards traditional CMS concepts

Thu, 03/14/2024 - 15:04

In recent years, new architectures like MACH and Jamstack have emerged in the world of Content Management Systems (CMS) and Digital Experience Platforms (DXP).

In some way, they have challenged traditional models. As a result, people sometimes ask for my take on MACH and Jamstack. I've mostly refrained from sharing my perspective to avoid controversy.

However, recognizing the value of diverse viewpoints, I've decided to share some of my thoughts. I hope it contributes positively to the ongoing evolution of these technologies.

The Jamstack is embracing its inner CMS

Jamstack, born in 2015, began as an approach for building static sites. The term Jamstack stands for JavaScript, APIs and Markup. Jamstack is an architectural approach that decouples the web experience layer from content and business logic. The web experience layer is often pre-rendered as static pages (markup), served via a Content Delivery Network (CDN).

By 2023, the Jamstack community had undergone a considerable transformation. It evolved significantly from its roots, increasingly integrating traditional CMS features such as "content previews". This shift is driven by the need to handle more complex websites and to make the Jamstack more user-friendly for marketers, moving beyond its developer-centric origins.

Netlify, a leader in the Jamstack community, has acknowledged this shift. As part of that, they are publicly moving away from the term "Jamstack" towards being a "composable web platform".

Many traditional CMSes, including Drupal, have long embraced the concept of "composability". For example, Drupal has been recognized as a composable CMS for two decades and offers advanced composable capabilities.

This expansion reflects a natural progression in technology, where software tends to grow more complex and multifaceted over time, often referred to as the law of increasing complexity. What starts simple becomes more complex over time.

I believe the Jamstack will continue adding traditional CMS features such as menu management, translation workflows, and marketing technology integrations until they more closely resemble traditional CMSes.

Wake-up call: CMSes are hybrid now

The Jamstack is starting to look more like traditional CMSes, not only in features, but also in architecture. Traditional CMSes can work in two main ways: "coupled" or "integrated", where they separate the presentation layer from the business logic using an approach called Model-View-Controller (MVC), or "decoupled", where the front-end and back-end are split using web service APIs, like the Jamstack does.

Modern CMSes integrate well with various JavaScript frameworks, have GraphQL backends, can render static pages, and much more.

This combination of both methods is known as "hybrid". Most traditional CMSes have adopted this hybrid approach. For example, Drupal has championed a headless approach for over a decade, predating both the terms "Jamstack" and "Headless".

Asserting that traditional CMSes are "monolithic" and "outdated" is a narrow-minded view held by people who have overlooked their evolution. In reality, today's choice is between a purely Headless or a Hybrid CMS.

Redefining "Jamstack" beyond overstated claims

As the Jamstack becomes more versatile, the original "Jamstack" name and definition feel restrictive. The essence of Jamstack, once centered on creating faster, secure websites with a straightforward deployment process, is changing.

It's time to move beyond some of the original value proposition, not just because of its evolution, but also because many of the Jamstack's advantages have been overstated for years.

In short, Jamstack, initially known for its static site generation and simplicity, is growing into something more dynamic and complex. This is a positive evolution driven by the market's need. This evolution narrows the gap between Jamstack and traditional CMSes, though they still differ somewhat – Jamstack offers a pure headless approach, while traditional CMSes offer both headless and integrated options.

Last but not least, this shift is making Jamstack more similar to MACH, leading to my opinion on the latter.

The key difference between MACH and traditional CMSes

Many readers, especially people in the Drupal community, might be unaware of MACH. MACH is an acronym for Microservices, API-first, Cloud-native SaaS, Headless. It specifies an architectural approach where each business function is a distinct cloud service, often created and maintained by separate vendors and integrated by the end user.

Imagine creating an online store with MACH certified solutions: you'd use different cloud services for each aspect – one for handling payments, another for your product catalog, and so on. A key architectural concept of MACH is that each of these services can be developed, deployed, and managed independently.

At first glance, that doesn't look too different from Drupal or WordPress. Platforms like WordPress and Drupal are already integration-rich, with Drupal surpassing 50,000 different modules in 2023. In fact, some of these modules integrate Drupal with MACH services. For example, when building an online store in Drupal or WordPress, you'd install modules that connect with a payment service, SaaS-based product catalog, and so on.

At a glance, this modular approach seems similar to MACH. Yet, there is a distinct contrast: Drupal and WordPress extend their capabilities by adding modules to a "core platform", while MACH is a collection of independent services, operating without relying on an underlying core platform.

MACH could benefit from "monolithic" features

Many in the MACH community applaud the absence of a core platform and often criticize CMSes that have one as "monolithic". Calling a CMS like Drupal "monolithic" is, in my opinion, a misleading and simplistic marketing strategy. From a technical perspective, Drupal's core platform is exceptionally modular, allowing all components to be swapped.

Rather than (mis)labeling traditional CMSes as "monolithic", a more accurate way to explain the difference between MACH and traditional CMSes is to focus on the presence or absence of a "core platform".

More importantly, what is often overlooked is the vital role a core platform plays in maintaining a consistent user experience, centralized account management, handling integration compatibility and upgrades, streamlining translation and content workflows across integrations, and more. The core platform essentially offers "shared services" aimed to help improve the end user and developer experience. Without a core platform, there can be increased development and maintenance costs, a fragmented user experience, and significant challenges in composability.

This aligns with a phenomenon that I call "MACH fatigue". Increasingly, organizations come to terms with the expenses of developing and maintaining a MACH-based system. Moreover, end-users often face a fragmented user experience. Instead of a seamlessly integrated interface, they often have to navigate multiple disconnected systems to complete their tasks.

To me, it seems that an ideal solution might be described as "loosely-coupled architectures with a highly integrated user experience". Such architectures allow the flexibility to mix and match the best systems (like your preferred CMS, CRM, and commerce platform). Meanwhile, a highly integrated user experience ensures that these combinations are seamless, not just for the end users but also for content creators and experience builders.

At present, traditional platforms like Drupal are closer to this ideal compared to MACH solutions. I anticipate that in the coming years, the MACH community will focus on improving the cost-effectiveness of MACH platforms. This will involve creating tools for managing dependencies and ensuring version compatibility, similar to the practices embraced by Drupal with Composer.

Efforts are already underway to streamline the MACH user experience with "Digital Experience Composition (DXC) tools". DXC acts as a layer over a MACH architecture, offering a user interface that breaks down various MACH elements into modular "Lego blocks". This allows both developers and marketers to assemble these blocks into a digital experience. Users familiar with traditional CMSes might find this a familiar concept, as many CMS platforms include DXC elements as integral features within their core platform.

Traditional CMSes like Drupal can also take cues from Jamstack and MACH, particularly in their approaches to web services. For instance, while Drupal effectively separates business logic from the presentation layer using the MVC pattern, it primarily relies on internal APIs. A shift towards more public web service APIs could enhance Drupal's flexibility and innovation potential.

Conclusions

In recent years, we've witnessed a variety of technical approaches in the CMS/DXP landscape, with MACH, Jamstack, decoupled, and headless architectures each carving out their paths.

Initially, these paths appeared to diverge. However, we're now seeing a trend of convergence, where different systems are learning from each other and integrating their unique strengths.

The Jamstack is evolving beyond its original focus on static site generation into a more dynamic and complex approach, narrowing the gap with traditional CMSes.

Similarly, to bridge the divide with traditional CMSes, MACH may need to broaden its scope to encompass shared services commonly found in the core platform of traditional CMSes. That would help with developer cost, composability and user-friendliness.

In the end, the success of any platform is judged by how effectively it delivers a good user experience and cost efficiency, regardless of its architecture. The focus needs to move away from architectural considerations to how these technologies can create more intuitive, powerful platforms for end users.

Categories: Drupal News

MidCamp - Midwest Drupal Camp: Beware the Ides of March!

Thu, 03/14/2024 - 03:01
Beware the Ides of March!

With just one week until we meet in person for MidCamp 2024, book your ticket before standard discount pricing ends on March 14 at 11:59 p.m. CT to save $100!

Session Schedule

We’ve got a great line-up this year! All sessions on Wednesday and Thursday (March 20/21) are included in the price of your MidCamp registration. We encourage you to start planning your days -- and get ready for some great learning opportunities!

Expanded Learning Tickets

Know any students or individuals seeking to expand their Drupal knowledge?

We have heavily discounted tickets ($25!) available for students and anyone wanting to learn more about Drupal. Join us at MidCamp and get to know the community!

There are sessions for everyone—topics range from Site Building and DevOps to Project Management and Design.

Get your tickets now!

Volunteer for some exclusive swag!

If you know you'll be joining us in Chicago, why not volunteer some of your spare time and get some exclusive swag in return! Check out our non-code opportunities to get involved.

Stay In The Loop

Please feel free to ask on the MidCamp Slack and come hang out with the community online. We will be making announcements there from time to time. We’re also on Twitter and Mastodon.

We can’t wait to see you next week! 

The MidCamp Team

Categories: Drupal News

The Drop Times: Zoocha: The Drupal Development Specialists in the UK

Wed, 03/13/2024 - 23:34
Discover the journey of Zoocha, a leading Drupal Development Agency renowned for its commitment to innovation, sustainability, and client success. Since its inception in 2009, Zoocha has demonstrated a profound dedication to open-source technology and agile methodologies, earning notable certifications and accolades. With a portfolio boasting collaborations with high-profile clients across various sectors, Zoocha excels in delivering customized Drupal solutions that meet unique client needs.

This article delves into Zoocha's strategies for engaging talent, promoting Drupal among younger audiences, and its ambitious goal to achieve carbon neutrality by 2025. Explore how Zoocha's commitment to quality, security, and environmental sustainability positions it as a trusted partner in the dynamic world of web development.
Categories: Drupal News

Four Kitchens: Custom Drush commands with Drush Generate

Wed, 03/13/2024 - 20:59

Marc Berger

Senior Backend Engineer

Always looking for a challenge, Marc tries to add something new to his toolbox for every project and build — be it a new CSS technology, creating custom APIs, or testing out new processes for development.

Recently, one of our clients had to retrieve some information from their Drupal site during a CI build. They needed to know the internal Drupal path from a known path alias. Common Drush commands don’t provide this information directly, so we decided to write our own custom Drush command. It was a lot easier than we thought it would be! Let’s get started.

Note: This post is based on commands and structure for Drush 12.

While we can write our own Drush command from scratch, let's discuss a tool that Drush already provides us: the drush generate command. Drush 9 added support for generating scaffolding and boilerplate code for many common Drupal coding tasks such as custom modules, themes, services, plugins, and more. The nice thing about using the drush generate command is that the code it generates conforms to best practices and Drupal coding standards — and some generators even come with examples. You can see all available generators by simply running drush generate without any arguments.

Step 1: Create a custom module

To get started, creating a new custom Drush command in this way requires an existing custom module in the codebase. If one exists, great. You can skip to Step 2 below. If you need a custom module, let's use Drush to generate one:

[code bash]$ drush generate module[/code]

Drush will ask a series of questions such as the module name, the package, any dependencies, and if you want to generate a .module file, README.md, etc. Once the module has been created, enable the module. This will help with the autocomplete when generating the custom Drush command.

[code bash]$ drush en <machine_name_of_custom_module>[/code]

Step 2: Create custom Drush command boilerplate

First, make sure you have a custom module where your new custom Drush command will live and that the module is enabled. Next, run the following command to generate some boilerplate code:

[code bash]$ drush generate drush:command-file[/code]

This command will also ask some questions, the first of which is the machine name of the custom module. If that module is enabled, it will autocomplete the name in the terminal. You can also tell the generator to use dependency injection if you know what services you need to use. In our case, we need to inject the path_alias.manager service. Once generated, the new command class will live here under your custom module:

<machine_name_of_custom_module>/src/Drush/Commands

Let’s take a look at this newly generated code. We will see the standard class structure and our dependency injection at the top of the file:

[code php]<?php

namespace Drupal\custom_drush\Drush\Commands;

use Consolidation\OutputFormatters\StructuredData\RowsOfFields;
use Drupal\Core\StringTranslation\StringTranslationTrait;
use Drupal\Core\Utility\Token;
use Drupal\path_alias\AliasManagerInterface;
use Drush\Attributes as CLI;
use Drush\Commands\DrushCommands;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * A Drush commandfile.
 *
 * In addition to this file, you need a drush.services.yml
 * in root of your module, and a composer.json file that provides the name
 * of the services file to use.
 */
final class CustomDrushCommands extends DrushCommands {

  use StringTranslationTrait;

  /**
   * Constructs a CustomDrushCommands object.
   */
  public function __construct(
    private readonly Token $token,
    private readonly AliasManagerInterface $pathAliasManager,
  ) {
    parent::__construct();
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('token'),
      $container->get('path_alias.manager'),
    );
  }[/code]

Note: The generator adds a comment about needing a drush.services.yml file. This requirement is deprecated and will be removed in Drush 13, so you can ignore it if you are using Drush 12. In our testing, this file does not need to be present.

Further down in the new class, we will see some boilerplate example code. This is where the magic happens:

[code php]  /**
   * Command description here.
   */
  #[CLI\Command(name: 'custom_drush:command-name', aliases: ['foo'])]
  #[CLI\Argument(name: 'arg1', description: 'Argument description.')]
  #[CLI\Option(name: 'option-name', description: 'Option description')]
  #[CLI\Usage(name: 'custom_drush:command-name foo', description: 'Usage description')]
  public function commandName($arg1, $options = ['option-name' => 'default']) {
    $this->logger()->success(dt('Achievement unlocked.'));
  }[/code]

This new Drush command doesn't do very much at the moment, but it provides a great jumping-off point. The first things to note at the top of the function are the new PHP 8 attributes that begin with #. These replace the PHP annotations commonly seen when writing custom plugins in Drupal. You can read more about the new PHP attributes.

The different attributes tell Drush what our custom command name is, description, what arguments it will take (if any), and any aliases it may have.
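The boilerplate is already runnable as-is. Assuming the generated defaults shown above (command name custom_drush:command-name, alias foo, one required argument), invoking it should simply log the placeholder success message, along these lines:

[code bash]$ drush custom_drush:command-name some-argument
 [success] Achievement unlocked.[/code]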

Step 3: Create our custom command

For our custom command, let’s modify the code so we can get the internal path from a path alias:

[code php]  /**
   * Command description here.
   */
  #[CLI\Command(name: 'custom_drush:internal-path', aliases: ['intpath'])]
  #[CLI\Argument(name: 'pathAlias', description: 'The path alias, must begin with /')]
  #[CLI\Usage(name: 'custom_drush:internal-path /path-alias', description: 'Supply the path alias and the internal path will be retrieved.')]
  public function getInternalPath($pathAlias) {
    if (!str_starts_with($pathAlias, "/")) {
      $this->logger()->error(dt('The alias must start with a /'));
    }
    else {
      $path = $this->pathAliasManager->getPathByAlias($pathAlias);
      if ($path == $pathAlias) {
        $this->logger()->error(dt('There was no internal path found that uses that alias.'));
      }
      else {
        $this->output()->writeln($path);
      }
    }
    // $this->logger()->success(dt('Achievement unlocked.'));
  }[/code]

What we’re doing here is changing the name of the command so it can be called like so:

[code bash]$ drush custom_drush:internal-path <path>[/code]

or via the alias:

[code bash]$ drush intpath <path>[/code]

The <path> is a required argument (such as /my-amazing-page) because of how it is declared in the getInternalPath() method. The method first checks whether the path starts with /. If it does, it performs an additional check to see whether an internal path exists for that alias. If so, it writes out the internal path, e.g., /node/1234. Error messages are provided by the logger() method that comes from the inherited DrushCommands class, while the successful result is printed with output()->writeln(). It's a simple command, but one that helped us automatically set config during a CI job.
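Putting it together, a quick run using the example values from this post (the alias /my-amazing-page and the resulting /node/1234 are illustrative, assuming such an alias exists on your site) looks like this:

[code bash]$ drush intpath /my-amazing-page
/node/1234[/code]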

Table output

Note that the boilerplate code also generated another example below the first — one that will provide output in a table format:

[code php]  /**
   * An example of the table output format.
   */
  #[CLI\Command(name: 'custom_drush:token', aliases: ['token'])]
  #[CLI\FieldLabels(labels: [
    'group' => 'Group',
    'token' => 'Token',
    'name' => 'Name',
  ])]
  #[CLI\DefaultTableFields(fields: ['group', 'token', 'name'])]
  #[CLI\FilterDefaultField(field: 'name')]
  public function token($options = ['format' => 'table']): RowsOfFields {
    $all = $this->token->getInfo();
    foreach ($all['tokens'] as $group => $tokens) {
      foreach ($tokens as $key => $token) {
        $rows[] = [
          'group' => $group,
          'token' => $key,
          'name' => $token['name'],
        ];
      }
    }
    return new RowsOfFields($rows);
  }[/code]

In this example, no argument is required, and it will simply print out the list of tokens in a nice table:

[code] ------------ ------------------ -----------------------
  Group        Token              Name
 ------------ ------------------ -----------------------
  file         fid                File ID
  node         nid                Content ID
  site         name               Name
  ...          ...                ...[/code]

Final thoughts

Drush is a powerful tool, and like many parts of Drupal, it’s expandable to meet different needs. While I shared a relatively simple example to solve a small challenge, the possibilities are open to retrieve all kinds of information from your Drupal site to use in scripting, CI/CD jobs, reporting, and more. And by using the drush generate command, creating these custom solutions is easy, follows best practices, and helps keep code consistent.

Further reading

The post Custom Drush commands with Drush Generate appeared first on Four Kitchens.

Categories: Drupal News

Tag1 Consulting: The DDEV Local Development Environment: Talking with Maintainer Randy Fay

Wed, 03/13/2024 - 13:40

Randy Fay, the maintainer of DDEV, discusses the key features and functionalities of DDEV, a Docker-based development environment that streamlines setting up and managing local development for applications (no Docker knowledge is required). Whether you're creating applications in Python, PHP, or other languages, DDEV will save you tremendous time and effort. It also works great for managing multiple projects, or working with a large distributed team, ensuring everyone's configurations remain in sync. Randy also demos DDEV, showcasing how fast and easy it is to set up a local Drupal development environment. Additionally, he touches upon the history and future of DDEV, and the critical role of the DDEV user community in both supporting the project and shaping its development. This interview is perfect for any developer interested in efficient development tools, current DDEV users, or anyone curious about local development technologies and best practices.

For a transcript of this video, see The DDEV Local Development Environment - Talking with Randy Fay.

Links:
  • DDEV: ddev.com
  • Docs: https://ddev.readthedocs.io
  • CMS Quickstarts: https://ddev.readthedocs.io/en/stable/users/quickstart/
  • DDEV 2023 Review: https://ddev.com/blog/2023-review
  • [DDEV 2024 Plans](https://ddev.com/blog/2024-plans...

Categories: Drupal News

Talking Drupal: Skills Upgrade 2

Wed, 03/13/2024 - 06:00

Welcome back to “Skills Upgrade,” a Talking Drupal mini-series following the journey of a D7 developer learning D10. This is episode 2.

Topics
  • Review Chad's goals for the previous week
    • DDEV Installation
    • Docker for Mac vs other options
    • IDE Setup
  • Review Chad's questions
  • Tasks for the upcoming week (a minimal command-line sketch follows this list)
    • DDEV improve performance
    • Install Drupal 10
    • Install drupal/core dependencies
    • Configure and test phpcs
    • Test phpstan
    • Set up settings.local.php
    • Install devel module
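For anyone following along, here is a minimal command-line sketch of what those tasks might look like in a DDEV-based Drupal 10 project. The paths and package choices (a web/ docroot, drupal/core-dev supplying phpcs and PHPStan, the Devel module) are assumptions about a typical setup, not steps taken from the episode:

[code bash]# Pull in Drupal core's development dependencies (coding standards, PHPStan, PHPUnit, ...)
$ ddev composer require --dev drupal/core-dev

# Check custom code against the Drupal coding standards
$ ddev exec vendor/bin/phpcs --standard=Drupal,DrupalPractice web/modules/custom

# Run a first static-analysis pass over custom modules (a phpstan.neon is the usual next step)
$ ddev exec vendor/bin/phpstan analyse -l 1 web/modules/custom

# Enable local settings overrides
$ cp web/sites/example.settings.local.php web/sites/default/settings.local.php
# ...then uncomment the settings.local.php include at the bottom of settings.php

# Install and enable the Devel module
$ ddev composer require --dev drupal/devel
$ ddev drush en devel -y[/code]
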
Resources

  • DDEV Performance
  • DDEV Quickstart
  • Drupal Core Dependencies
  • How to Implement Drupal Code Standards
  • Running PHPStan On Drupal Custom Modules
  • Why you should care about using settings.local.php
  • Rancher Desktop

  • Chad's Drupal 10 Learning Curriculum & Journal
  • Chad's Drupal 10 Learning Notes

Hosts

AmyJune Hineline - @volkswagenchick

Guests

Chad Hester - chadkhester.com @chadkhest

Mike Anello - DrupalEasy.com @ultimike

Categories: Drupal News

The Drop Times: The Revolutionary Impact of Gander Automated Performance Testing

Tue, 03/12/2024 - 22:34
Performance is a cornerstone of user experience and operational efficiency in web development. Delve into the genesis, capabilities, and transformative potential of Gander, the automated performance testing framework for Drupal, as elucidated by seasoned contributor Nathaniel Catchpole.
Categories: Drupal News

Balint Pekker: The Future of Drupal

Tue, 03/12/2024 - 19:56
In the world of Drupal, staying ahead of the curve is essential for building websites that are not just functional but also future-proof. As the digital landscape continues to evolve, it's crucial to explore emerging trends and technologies in Drupal development that are shaping the future. In this blog post, we'll dive into some of these exciting possibilities and discuss how they can lift Drupal websites to new heights.
Categories: Drupal News