pw_tokenizer: Remove global_handler_with_payload facade
It does not make sense to offer a limited number of global tokenization
handlers: tokenization has many applications, and they cannot all share
a single global handler. Instead, each tokenizer user should own its
handler function.

For pw_log_tokenized, replace the
pw_tokenizer:global_handler_with_payload facade with a pw_log_tokenized
facade. For backwards compatibility, provide implementations of the old
tokenizer global handler function in pw_log_tokenized, since that module
was its only user.
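The contract of the new facade can be sketched as follows. This is an illustrative stand-in backend, not one shipped by Pigweed; the `last_payload`/`last_size` capture globals are hypothetical, added only so the example is checkable.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Captured state so the example is checkable; a real backend would
// forward to a sink (UART, HDLC frames, a MultiSink, ...) instead.
uint32_t last_payload = 0;
size_t last_size = 0;

// Each pw_log_tokenized user implements this one handler function.
// The signature matches the new facade used throughout the diffs below.
extern "C" void pw_log_tokenized_HandleLog(uint32_t payload,
                                           const uint8_t message[],
                                           size_t size_bytes) {
  last_payload = payload;  // Packed metadata: level, module, flags, line.
  last_size = size_bytes;  // Length of the encoded tokenized message.
  (void)message;
}
```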

Change-Id: Ibc4e16039660c04a77bf0f3b5317d493292a7e68
Reviewed-on: https://pigweed-review.googlesource.com/c/pigweed/pigweed/+/128657
Pigweed-Auto-Submit: Wyatt Hepler <[email protected]>
Commit-Queue: Auto-Submit <[email protected]>
Reviewed-by: Keir Mierle <[email protected]>
255 authored and CQ Bot Account committed Feb 24, 2023
1 parent f47ba18 commit 3e2806b
Showing 38 changed files with 385 additions and 765 deletions.
7 changes: 3 additions & 4 deletions pw_assert_tokenized/docs.rst
@@ -50,10 +50,9 @@ Setup
``pw_assert_LITE_BACKEND = "$dir_pw_assert_tokenized:assert_backend"`` in
your target configuration.
#. Ensure your target provides
``pw_tokenizer_GLOBAL_HANDLER_WITH_PAYLOAD_BACKEND``. By default,
pw_assert_tokenized will forward assert failures to the tokenizer handler as
logs. The tokenizer handler should check for ``LOG_LEVEL_FATAL`` and properly
divert to a crash handler.
``pw_log_tokenized_HANDLER_BACKEND``. By default, pw_assert_tokenized will
forward assert failures to the log system. The tokenizer handler should check
for ``LOG_LEVEL_FATAL`` and properly divert to a crash handler.
#. Add file name tokens to your token database. pw_assert_tokenized can't create
file name tokens that can be parsed out of the final compiled binary. The
``pw_relative_source_file_names``
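The divert-on-fatal behavior this setup step describes can be sketched as follows. The field width, fatal level value, and crash hook here are hypothetical; real projects read the widths from the `PW_LOG_TOKENIZED_*_BITS` config macros and call their own crash handler.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Hypothetical values; real code uses the pw_log_tokenized config macros
// and the project's actual crash handler.
constexpr uint32_t kLevelBits = 3;
constexpr uint32_t kLevelFatal = 7;  // Stand-in for LOG_LEVEL_FATAL.

bool crash_handler_invoked = false;
void HypotheticalCrashHandler() { crash_handler_invoked = true; }

extern "C" void pw_log_tokenized_HandleLog(uint32_t metadata,
                                           const uint8_t[] /* message */,
                                           size_t /* size_bytes */) {
  // The log level occupies the low bits of the packed metadata word.
  const uint32_t level = metadata & ((1u << kLevelBits) - 1u);
  if (level == kLevelFatal) {
    HypotheticalCrashHandler();  // Divert instead of logging normally.
  }
}
```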
10 changes: 5 additions & 5 deletions pw_assert_tokenized/log_handler.cc
@@ -22,6 +22,7 @@
#include "pw_bytes/endian.h"
#include "pw_log/log.h"
#include "pw_log_tokenized/config.h"
#include "pw_log_tokenized/handler.h"
#include "pw_log_tokenized/log_tokenized.h"
#include "pw_log_tokenized/metadata.h"
#include "pw_span/span.h"
@@ -63,14 +64,13 @@ extern "C" void pw_assert_tokenized_HandleCheckFailure(
// number values that would cause the bit field to overflow.
// See https://pigweed.dev/pw_log_tokenized/#c.PW_LOG_TOKENIZED_LINE_BITS for
// more info.
const pw_tokenizer_Payload payload =
pw::log_tokenized::Metadata(
PW_LOG_LEVEL_FATAL, 0, PW_LOG_FLAGS, line_number)
.value();
const uint32_t payload = pw::log_tokenized::Metadata(
PW_LOG_LEVEL_FATAL, 0, PW_LOG_FLAGS, line_number)
.value();
std::array<std::byte, sizeof(tokenized_message)> token_buffer =
pw::bytes::CopyInOrder(pw::endian::little, tokenized_message);

pw_tokenizer_HandleEncodedMessageWithPayload(
pw_log_tokenized_HandleLog(
payload,
reinterpret_cast<const uint8_t*>(token_buffer.data()),
token_buffer.size());
4 changes: 2 additions & 2 deletions pw_log/protobuf.rst
@@ -93,8 +93,8 @@ for encoding tokenized logs and string-based logs.
#include "pw_log/proto_utils.h"
extern "C" void pw_tokenizer_HandleEncodedMessageWithPayload(
pw_tokenizer_Payload payload, const uint8_t data[], size_t size) {
extern "C" void pw_log_tokenized_HandleLog(
uint32_t payload, const uint8_t data[], size_t size) {
pw::log_tokenized::Metadata metadata(payload);
std::byte log_buffer[kLogBufferSize];
11 changes: 5 additions & 6 deletions pw_log_rpc/docs.rst
@@ -28,9 +28,9 @@ Set up the :ref:`module-pw_log_tokenized` log backend.
3. Connect the tokenized logging handler to the MultiSink
---------------------------------------------------------
Create a :ref:`MultiSink <module-pw_multisink>` instance to buffer log entries.
Then, make the log backend handler,
``pw_tokenizer_HandleEncodedMessageWithPayload``, encode log entries in the
``log::LogEntry`` format, and add them to the ``MultiSink``.
Then, make the log backend handler, :cpp:func:`pw_log_tokenized_HandleLog`,
encode log entries in the ``log::LogEntry`` format, and add them to the
``MultiSink``.

4. Create log drains and filters
--------------------------------
@@ -366,7 +366,6 @@ log drains and filters are set up.
#include "pw_sync/interrupt_spin_lock.h"
#include "pw_sync/lock_annotations.h"
#include "pw_sync/mutex.h"
#include "pw_tokenizer/tokenize_to_global_handler_with_payload.h"
namespace foo::log {
namespace {
@@ -418,8 +417,8 @@ log drains and filters are set up.
};
pw::log_rpc::FilterMap filter_map(filters);
extern "C" void pw_tokenizer_HandleEncodedMessageWithPayload(
pw_tokenizer_Payload metadata, const uint8_t message[], size_t size_bytes) {
extern "C" void pw_log_tokenized_HandleLog(
uint32_t metadata, const uint8_t message[], size_t size_bytes) {
int64_t timestamp =
pw::chrono::SystemClock::now().time_since_epoch().count();
std::lock_guard lock(log_encode_lock);
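The handler-to-MultiSink flow from step 3 can be sketched as below. `multisink_entries` is a plain vector standing in for the real `pw::multisink::MultiSink`, and the `log::LogEntry` proto encoding and timestamping are elided to keep the data flow visible.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <mutex>
#include <vector>

// Stand-ins for the real MultiSink and its lock; illustration only.
std::vector<std::vector<uint8_t>> multisink_entries;
std::mutex log_encode_lock;

extern "C" void pw_log_tokenized_HandleLog(uint32_t metadata,
                                           const uint8_t message[],
                                           size_t size_bytes) {
  std::lock_guard lock(log_encode_lock);
  // Real code timestamps the entry, encodes {metadata, timestamp, message}
  // as a log::LogEntry proto, and hands it to the MultiSink; here we just
  // store the raw message bytes to show where they end up.
  multisink_entries.emplace_back(message, message + size_bytes);
  (void)metadata;
}
```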
38 changes: 36 additions & 2 deletions pw_log_tokenized/BUILD.bazel
@@ -37,28 +37,62 @@ pw_cc_library(
deps = [
"//pw_log:facade",
"//pw_tokenizer",
"//pw_tokenizer:global_handler_with_payload",
],
)

pw_cc_library(
name = "pw_log_tokenized",
srcs = ["log_tokenized.cc"],
deps = [
":handler",
":headers",
"//pw_log:facade",
],
)

pw_cc_library(
name = "handler_facade",
hdrs = ["public/pw_log_tokenized/handler.h"],
includes = ["public"],
deps = ["//pw_preprocessor"],
)

pw_cc_library(
name = "handler",
deps = [
":handler_facade",
"@pigweed_config//:pw_log_tokenized_handler_backend",
],
)

# There is no default backend for now.
pw_cc_library(
name = "backend_multiplexer",
visibility = ["@pigweed_config//:__pkg__"],
)

# The compatibility library is not needed in Bazel.
pw_cc_library(
name = "compatibility",
srcs = ["compatibility.cc"],
visibility = ["//visibility:private"],
deps = [
":handler_facade",
"//pw_tokenizer",
"//pw_tokenizer:global_handler_with_payload",
],
)

pw_cc_library(
name = "base64_over_hdlc",
srcs = ["base64_over_hdlc.cc"],
hdrs = ["public/pw_log_tokenized/base64_over_hdlc.h"],
includes = ["public"],
deps = [
":handler_facade",
"//pw_hdlc",
"//pw_stream:sys_io_stream",
"//pw_tokenizer:base64",
"//pw_tokenizer:global_handler_with_payload",
],
)

86 changes: 76 additions & 10 deletions pw_log_tokenized/BUILD.gn
@@ -14,10 +14,12 @@

import("//build_overrides/pigweed.gni")

import("$dir_pw_build/facade.gni")
import("$dir_pw_build/module_config.gni")
import("$dir_pw_build/target_types.gni")
import("$dir_pw_docgen/docs.gni")
import("$dir_pw_log/backend.gni")
import("$dir_pw_log_tokenized/backend.gni")
import("$dir_pw_tokenizer/backend.gni")
import("$dir_pw_unit_test/test.gni")

@@ -46,13 +48,70 @@ pw_source_set("pw_log_tokenized") {
]
public_deps = [
":config",
":handler.facade", # Depend on the facade to avoid circular dependencies.
":metadata",
"$dir_pw_tokenizer:global_handler_with_payload.facade",
dir_pw_tokenizer,
]
public = [
"public/pw_log_tokenized/log_tokenized.h",
"public_overrides/pw_log_backend/log_backend.h",
]

sources = [ "log_tokenized.cc" ]
}

config("backwards_compatibility_config") {
defines = [ "_PW_LOG_TOKENIZED_GLOBAL_HANDLER_BACKWARDS_COMPAT" ]
visibility = [ ":*" ]
}

# The old pw_tokenizer_GLOBAL_HANDLER_WITH_PAYLOAD_BACKEND backend may still be
# in use by projects that have not switched to the new pw_log_tokenized facade.
# Use the old backend as a stand-in for the new backend if it is set.
_old_backend_is_set = pw_tokenizer_GLOBAL_HANDLER_WITH_PAYLOAD_BACKEND != ""
_new_backend_is_set = pw_log_tokenized_HANDLER_BACKEND != ""

pw_facade("handler") {
public_configs = [ ":public_include_path" ]
public_deps = [
# TODO(hepler): Remove this dependency when all projects have migrated to
# the new pw_log_tokenized handler.
"$dir_pw_tokenizer:global_handler_with_payload",
dir_pw_preprocessor,
]

public = [ "public/pw_log_tokenized/handler.h" ]

# If the global handler backend is set, redirect the new facade to the old
# facade. If no backend is set, the old facade may still be in use through
# link deps, so provide the compatibility layer.
#
# TODO(hepler): Remove these backwards compatibility workarounds when projects
# have migrated.
if (_old_backend_is_set || (!_old_backend_is_set && !_new_backend_is_set)) {
assert(pw_log_tokenized_HANDLER_BACKEND == "",
"pw_tokenizer_GLOBAL_HANDLER_WITH_PAYLOAD_BACKEND is deprecated; " +
"only pw_log_tokenized_HANDLER_BACKEND should be set")

backend = pw_tokenizer_GLOBAL_HANDLER_WITH_PAYLOAD_BACKEND

# There is only one pw_log_tokenized backend in Pigweed, and it has been
# updated to the new API.
if (_old_backend_is_set &&
get_label_info(pw_tokenizer_GLOBAL_HANDLER_WITH_PAYLOAD_BACKEND,
"label_no_toolchain") ==
get_label_info(":base64_over_hdlc", "label_no_toolchain")) {
defines = [ "PW_LOG_TOKENIZED_BACKEND_USES_NEW_API=1" ]
} else {
defines = [ "PW_LOG_TOKENIZED_BACKEND_USES_NEW_API=0" ]
}

public_configs += [ ":backwards_compatibility_config" ]
deps = [ dir_pw_tokenizer ]
sources = [ "compatibility.cc" ]
} else {
backend = pw_log_tokenized_HANDLER_BACKEND
}
}

pw_source_set("metadata") {
@@ -74,10 +133,11 @@ pw_source_set("config") {
# pw_log is so ubiquitous. These deps are kept separate so they can be
# depended on from elsewhere.
pw_source_set("pw_log_tokenized.impl") {
deps = [
":pw_log_tokenized",
"$dir_pw_tokenizer:global_handler_with_payload",
]
deps = [ ":pw_log_tokenized" ]

if (_new_backend_is_set || _old_backend_is_set) {
deps += [ ":handler" ]
}
}

# This target provides a backend for pw_tokenizer that encodes tokenized logs as
@@ -87,19 +147,25 @@ pw_source_set("base64_over_hdlc") {
public = [ "public/pw_log_tokenized/base64_over_hdlc.h" ]
sources = [ "base64_over_hdlc.cc" ]
deps = [
":handler.facade",
"$dir_pw_hdlc:encoder",
"$dir_pw_stream:sys_io_stream",
"$dir_pw_tokenizer:base64",
"$dir_pw_tokenizer:global_handler_with_payload.facade",
dir_pw_span,
]
}

pw_test_group("tests") {
tests = [
":log_tokenized_test",
":metadata_test",
]
tests = [ ":metadata_test" ]

# TODO(b/269354373): The Windows MinGW compiler fails to link
# log_tokenized_test.cc because log_tokenized.cc refers to the log handler,
# which is not defined, even though _pw_log_tokenized_EncodeTokenizedLog is
# never called. Remove this check when the Windows build is consistent with
# other builds.
if (current_os != "win" || _new_backend_is_set || _old_backend_is_set) {
tests += [ ":log_tokenized_test" ]
}
}

pw_test("log_tokenized_test") {
21 changes: 17 additions & 4 deletions pw_log_tokenized/CMakeLists.txt
@@ -13,7 +13,7 @@
# the License.

include($ENV{PW_ROOT}/pw_build/pigweed.cmake)
include($ENV{PW_ROOT}/pw_tokenizer/backend.cmake)
include($ENV{PW_ROOT}/pw_log_tokenized/backend.cmake)

pw_add_module_config(pw_log_tokenized_CONFIG)

@@ -27,7 +27,7 @@ pw_add_library(pw_log_tokenized.config INTERFACE
${pw_log_tokenized_CONFIG}
)

pw_add_library(pw_log_tokenized INTERFACE
pw_add_library(pw_log_tokenized STATIC
HEADERS
public/pw_log_tokenized/log_tokenized.h
public_overrides/pw_log_backend/log_backend.h
@@ -36,9 +36,11 @@
public_overrides
PUBLIC_DEPS
pw_log_tokenized.config
pw_log_tokenized.handler
pw_log_tokenized.metadata
pw_tokenizer
pw_tokenizer.global_handler_with_payload
SOURCES
log_tokenized.cc
)

pw_add_library(pw_log_tokenized.metadata INTERFACE
@@ -50,6 +52,17 @@
pw_log_tokenized.config
)

pw_add_facade(pw_log_tokenized.handler INTERFACE
BACKEND
pw_log_tokenized.handler_BACKEND
HEADERS
public/pw_log_tokenized/handler.h
PUBLIC_INCLUDES
public
PUBLIC_DEPS
pw_preprocessor
)

# This target provides a backend for pw_tokenizer that encodes tokenized logs as
# Base64, encodes them into HDLC frames, and writes them over sys_io.
pw_add_library(pw_log_tokenized.base64_over_hdlc STATIC
@@ -61,10 +74,10 @@
base64_over_hdlc.cc
PRIVATE_DEPS
pw_hdlc.encoder
pw_log_tokenized.handler
pw_span
pw_stream.sys_io_stream
pw_tokenizer.base64
pw_tokenizer.global_handler_with_payload.facade
)

if(NOT "${pw_tokenizer.global_handler_with_payload_BACKEND}" STREQUAL "")
6 changes: 3 additions & 3 deletions pw_tokenizer/backend.cmake → pw_log_tokenized/backend.cmake
@@ -1,4 +1,4 @@
# Copyright 2022 The Pigweed Authors
# Copyright 2023 The Pigweed Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
@@ -15,5 +15,5 @@ include_guard(GLOBAL)

include($ENV{PW_ROOT}/pw_build/pigweed.cmake)

# Backend for the pw_tokenizer:global_handler_with_payload facade.
pw_add_backend_variable(pw_tokenizer.global_handler_with_payload_BACKEND)
# Backend for the pw_log_tokenized handler.
pw_add_backend_variable(pw_log_tokenized.handler_BACKEND)
18 changes: 18 additions & 0 deletions pw_log_tokenized/backend.gni
@@ -0,0 +1,18 @@
# Copyright 2023 The Pigweed Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

declare_args() {
# Backend for the pw_log_tokenized log handler.
pw_log_tokenized_HANDLER_BACKEND = ""
}
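A project opting into the new facade would set this argument in its target configuration. The backend label below is illustrative (it happens to be the one backend in Pigweed itself); any target that implements `pw_log_tokenized_HandleLog` works.

```gn
# In the project's args.gn or target toolchain configuration:
pw_log_tokenized_HANDLER_BACKEND = "$dir_pw_log_tokenized:base64_over_hdlc"
```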
6 changes: 3 additions & 3 deletions pw_log_tokenized/base64_over_hdlc.cc
@@ -18,10 +18,10 @@
#include "pw_log_tokenized/base64_over_hdlc.h"

#include "pw_hdlc/encoder.h"
#include "pw_log_tokenized/handler.h"
#include "pw_span/span.h"
#include "pw_stream/sys_io_stream.h"
#include "pw_tokenizer/base64.h"
#include "pw_tokenizer/tokenize_to_global_handler_with_payload.h"

namespace pw::log_tokenized {
namespace {
@@ -31,8 +31,8 @@ stream::SysIoWriter writer;
} // namespace

// Base64-encodes tokenized logs and writes them to pw::sys_io as HDLC frames.
extern "C" void pw_tokenizer_HandleEncodedMessageWithPayload(
pw_tokenizer_Payload, // TODO(hepler): Use the metadata for filtering.
extern "C" void pw_log_tokenized_HandleLog(
uint32_t, // TODO(hepler): Use the metadata for filtering.
const uint8_t log_buffer[],
size_t size_bytes) {
// Encode the tokenized message as Base64.