Compare commits
105 Commits
Author | SHA1 | Date |
---|---|---|
dependabot[bot] | a7517106e8 | |
Nick Groenen | dba05d0e9c | |
Nick Groenen | 82b6e597c6 | |
Nick Groenen | 90d7dc3cf4 | |
Nick Groenen | 7bbd211732 | |
Nick Groenen | d6f8b4e692 | |
Nick Groenen | ab3fe66d5c | |
Nick Groenen | 151679788a | |
dependabot[bot] | 4b88ad2b51 | |
dependabot[bot] | c7e0196fbe | |
dependabot[bot] | ede7a06c24 | |
dependabot[bot] | f3d063aab4 | |
dependabot[bot] | 06b2a3d6a4 | |
dependabot[bot] | 0606eca6de | |
dependabot[bot] | d3b1bd412e | |
Martin Heuschober | 018c9606a6 | |
dependabot[bot] | eb4c009207 | |
dependabot[bot] | 4e99f03a1f | |
dependabot[bot] | a791273d12 | |
dependabot[bot] | 97958f81c5 | |
Nick Groenen | 8ace49ded3 | |
Nick Groenen | 4c74371b9e | |
Nick Groenen | 5985ad70d6 | |
Nick Groenen | 3d98d65403 | |
Nick Groenen | 06191eb66e | |
Robert Sesek | 43d90d7879 | |
Robert Sesek | cd5dbf6c3b | |
Nick Groenen | c27d7b96b6 | |
Nick Groenen | b38e4d53b5 | |
Nick Groenen | b28e4913ee | |
Nick Groenen | f72ef651c0 | |
dependabot[bot] | 143f4fe1cd | |
dependabot[bot] | 33707d67a5 | |
dependabot[bot] | 88b2378862 | |
dependabot[bot] | 52d400ff01 | |
dependabot[bot] | b67aef7cdc | |
dependabot[bot] | d76bbdb3b7 | |
dependabot[bot] | 0dd235279e | |
dependabot[bot] | e197ac3408 | |
dependabot[bot] | 33c57f2322 | |
dependabot[bot] | ae87431847 | |
dependabot[bot] | 942b954a48 | |
dependabot[bot] | 277e057191 | |
dependabot[bot] | 90573cab3e | |
dependabot[bot] | 29772f8f5c | |
dependabot[bot] | 303e2053be | |
Nick Groenen | cb6abedcad | |
Nick Groenen | 4b636c4402 | |
dependabot[bot] | 80130260e9 | |
dependabot[bot] | 500f0fb86b | |
Nick Groenen | 83ab69aedd | |
Nick Groenen | b5b2ea2c3b | |
Chang-Yen Tseng | c5ba5b7aef | |
dependabot[bot] | 9bce284697 | |
dependabot[bot] | f2a0d1c041 | |
dependabot[bot] | 0659924635 | |
dependabot[bot] | 7a3f278e4b | |
dependabot[bot] | 89ad2d0e66 | |
Nick Groenen | be5cf58c1a | |
Nick Groenen | 6af4c9140c | |
Nick Groenen | 17d0e3df7e | |
Nick Groenen | 262f22ba70 | |
Nick Groenen | 0535de53a9 | |
Nick Groenen | 868f1132bc | |
Nick Groenen | 586530cac8 | |
Nick Groenen | 081eb6c9ab | |
Nick Groenen | d25c6d80c6 | |
dependabot[bot] | 85adc314b6 | |
Nick Groenen | 86af6bbf37 | |
Nick Groenen | 67cd5ac738 | |
Nick Groenen | 84308c9f1f | |
Nick Groenen | 838881fea0 | |
dependabot[bot] | c96acc1d6d | |
dependabot[bot] | 7026b0f684 | |
dependabot[bot] | fe896ddd47 | |
dependabot[bot] | fd346baee9 | |
dependabot[bot] | 740346fa4c | |
dependabot[bot] | e78f19a2fa | |
dependabot[bot] | 9a3ace0070 | |
Nick Groenen | e7486fa962 | |
Nick Groenen | f70d6b0cf0 | |
Narayan Sainaney | c4bc77402e | |
dependabot[bot] | 6d247104c4 | |
Nick Groenen | 7f94766cce | |
dependabot[bot] | 48ec896cee | |
Nick Groenen | 5460368297 | |
Nick Groenen | 8dc7e59a79 | |
Nick Groenen | 6afcd75f07 | |
Nick Groenen | 216179ef35 | |
Nick Groenen | 77e35980c4 | |
Nick Groenen | 481c62b78d | |
dependabot[bot] | 47df3739c5 | |
Nick Groenen | d138c92a25 | |
Nick Groenen | 634b0d70ac | |
Nick Groenen | c64d75967e | |
Nick Groenen | 82798daa89 | |
dependabot[bot] | ff58263707 | |
Nick Groenen | 18231775ae | |
Nick Groenen | 33eac07b1a | |
Nick Groenen | 2dc7809367 | |
dependabot[bot] | cd5b1503da | |
Nick Groenen | 51d263439d | |
Nick Groenen | 5ff990ca20 | |
dependabot[bot] | 9382ca2479 | |
dependabot[bot] | d436727f9f |
`.gitchangelog.rc` — 312 deletions (`@@ -1,312 +0,0 @@`). The removed file:

```python
# vim: set ft=python
##
## Format
##
##   ACTION: [AUDIENCE:] COMMIT_MSG [!TAG ...]
##
## Description
##
##   ACTION is one of 'chg', 'fix', 'new'
##
##     Is WHAT the change is about.
##
##     'chg' is for refactor, small improvement, cosmetic changes...
##     'fix' is for bug fixes
##     'new' is for new features, big improvement
##
##   AUDIENCE is optional and one of 'dev', 'usr', 'pkg', 'test', 'doc'
##
##     Is WHO is concerned by the change.
##
##     'dev' is for developers (API changes, refactors...)
##     'usr' is for final users (UI changes)
##     'pkg' is for packagers (packaging changes)
##     'test' is for testers (test only related changes)
##     'doc' is for doc guys (doc only changes)
##
##   COMMIT_MSG is ... well ... the commit message itself.
##
##   TAGs are additional adjectives such as 'refactor', 'minor', 'cosmetic'.
##
##     They are preceded with a '!' or a '@' (prefer the former, as the
##     latter is wrongly interpreted in github.) Commonly used tags are:
##
##     'refactor' is obviously for refactoring code only
##     'minor' is for a very meaningless change (a typo, adding a comment)
##     'cosmetic' is for cosmetic driven change (re-indentation, 80-col...)
##     'wip' is for partial functionality but complete subfunctionality.
##
## Example:
##
##   new: usr: support of bazaar implemented
##   chg: re-indented some lines !cosmetic
##   new: dev: updated code to be compatible with last version of killer lib.
##   fix: pkg: updated year of licence coverage.
##   new: test: added a bunch of tests around user usability of feature X.
##   fix: typo in spelling my name in comment. !minor
##
## Please note that multi-line commit messages are supported, and only the
## first line will be considered as the "summary" of the commit message. So
## tags and other rules only apply to the summary. The body of the commit
## message will be displayed in the changelog without reformatting.


##
## ``ignore_regexps`` is a list of regexps
##
## Any commit having its full commit message matching any regexp listed here
## will be ignored and won't be reported in the changelog.
##
ignore_regexps = [
    r'!skip_changelog',
    r'^Release v[0-9]+\.[0-9]+\.[0-9]+$',
    r'^(.{3,3}\s*:)?\s*[Ii]nitial commit.?\s*$',
]


## ``section_regexps`` is a list of 2-tuples associating a string label and a
## list of regexps
##
## Commit messages will be classified in sections thanks to this. Section
## titles are the labels, and a commit is classified under a section if any
## of its associated regexps matches.
##
## Please note that ``section_regexps`` will only classify commits and won't
## make any changes to the contents. So you'll probably want to go check
## ``subject_process`` (or ``body_process``) to do some changes to the subject,
## whenever you are tweaking this variable.
##
section_regexps = [
    ('New', [
        r'^[nN]ew\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
    ]),
    ('Changes', [
        r'^[cC]hg\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
    ]),
    ('Fixes', [
        r'^[fF]ix\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
    ]),

    ('Other', None),  ## Match all lines
]


## ``body_process`` is a callable
##
## This callable will be given the original body, and its result will
## be used in the changelog.
##
## Available constructs are:
##
##   - any python callable that takes one txt argument and returns a txt argument.
##
##   - ReSub(pattern, replacement): will apply regexp substitution.
##
##   - Indent(chars=" "): will indent the text with the prefix.
##     Please remember that template engines also get to modify the text and
##     will usually indent the text themselves if needed.
##
##   - Wrap(regexp=r"\n\n"): re-wrap text in separate paragraphs to fill 80 columns.
##
##   - noop: do nothing
##
##   - ucfirst: ensure the first letter is uppercase.
##     (usually used in the ``subject_process`` pipeline)
##
##   - final_dot: ensure text finishes with a dot
##     (usually used in the ``subject_process`` pipeline)
##
##   - strip: remove any spaces before or after the content of the string
##
##   - SetIfEmpty(msg="No commit message."): will set the text to
##     whatever given ``msg`` if the current text is empty.
##
## Additionally, you can `pipe` the provided filters, for instance:
#body_process = Wrap(regexp=r'\n(?=\w+\s*:)') | Indent(chars=" ")
#body_process = Wrap(regexp=r'\n(?=\w+\s*:)')
#body_process = noop
body_process = ReSub(r'((^|\n)[A-Z]\w+(-\w+)*: .*(\n\s+.*)*)+$', r'') | strip


## ``subject_process`` is a callable
##
## This callable will be given the original subject, and its result will
## be used in the changelog.
##
## Available constructs are those listed in the ``body_process`` doc.
subject_process = (strip |
    ReSub(r'^([cC]hg|[fF]ix|[nN]ew)\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n@]*)(@[a-z]+\s+)*$', r'\4') |
    SetIfEmpty("No commit message.") | ucfirst | final_dot)


## ``tag_filter_regexp`` is a regexp
##
## Tags that will be used for the changelog must match this regexp.
##
tag_filter_regexp = r'^v[0-9]+\.[0-9]+(\.[0-9]+)?$'


## ``unreleased_version_label`` is a string or a callable that outputs a string
##
## This label will be used as the changelog title of the last set of changes
## between the last valid tag and HEAD, if any.
unreleased_version_label = "(unreleased)"


## ``output_engine`` is a callable
##
## This will change the output format of the generated changelog file
##
## Available choices are:
##
##   - rest_py
##
##     Legacy pure python engine, outputs ReSTructured text.
##     This is the default.
##
##   - mustache(<template_name>)
##
##     Template name could be any of the available templates in
##     ``templates/mustache/*.tpl``.
##     Requires python package ``pystache``.
##     Examples:
##       - mustache("markdown")
##       - mustache("restructuredtext")
##
##   - makotemplate(<template_name>)
##
##     Template name could be any of the available templates in
##     ``templates/mako/*.tpl``.
##     Requires python package ``mako``.
##     Examples:
##       - makotemplate("restructuredtext")
##
#output_engine = rest_py
#output_engine = mustache("restructuredtext")
output_engine = mustache("markdown")
#output_engine = makotemplate("restructuredtext")


## ``include_merge`` is a boolean
##
## This option tells git-log whether to include merge commits in the log.
## The default is to include them.
include_merge = False


## ``log_encoding`` is a string identifier
##
## This option tells gitchangelog what encoding is output by ``git log``.
## The default is to be clever about it: it checks ``git config`` for
## ``i18n.logOutputEncoding``, and if not found will default to git's own
## default: ``utf-8``.
log_encoding = 'utf-8'


## ``publish`` is a callable
##
## Sets what ``gitchangelog`` should do with the output generated by
## the output engine. ``publish`` is a callable taking one argument
## that is an iterator on lines from the output engine.
##
## Some helper callables are provided. Available choices are:
##
##   - stdout
##
##     Outputs directly to standard output
##     (This is the default)
##
##   - FileInsertAtFirstRegexMatch(file, pattern, idx=lambda m: m.start())
##
##     Creates a callable that will parse the given file for the given
##     regex pattern and will insert the output in the file.
##     ``idx`` is a callable that receives the match object and
##     must return an integer index for where to insert the
##     output in the file. The default is to return the position of
##     the start of the matched string.
##
##   - FileRegexSubst(file, pattern, replace, flags)
##
##     Apply a replace in place in the given file. Your regex pattern must
##     take care of everything and might be more complex. Check the README
##     for a complete copy-pastable example.
##
#publish = FileInsertIntoFirstRegexMatch(
#    "CHANGELOG.rst",
#    r'/(?P<rev>[0-9]+\.[0-9]+(\.[0-9]+)?)\s+\([0-9]+-[0-9]{2}-[0-9]{2}\)\n--+\n/',
#    idx=lambda m: m.start(1)
#)
#publish = stdout
OUTPUT_FILE = "CHANGES.md"
INSERT_POINT_REGEX = r'''(?isxu)
^
(
  \s*\#\s+Changelog\s*(\n|\r\n|\r)              ## ``Changelog`` line
)

(                                               ## Match all between changelog and release rev
    (
      (?!
         (?<=(\n|\r))                           ## look back for newline
         \#\#\s+%(rev)s                         ## revision
         \s+
         \([0-9]+-[0-9]{2}-[0-9]{2}\)(\n|\r\n|\r)  ## date
      )
      .
    )*
)

(?P<tail>\#\#\s+(?P<rev>%(rev)s))
''' % {'rev': r"v[0-9]+\.[0-9]+(\.[0-9]+)?"}

revs = [
    Caret(FileFirstRegexMatch(OUTPUT_FILE, INSERT_POINT_REGEX)),
    "HEAD"
]

publish = FileRegexSubst(OUTPUT_FILE, INSERT_POINT_REGEX, r"\1\o\n\g<tail>")


## ``revs`` is a list of callables or a list of strings
##
## Callables will be called to resolve as strings and allow dynamic
## computation of these. The result will be used as revisions for
## gitchangelog (as if directly stated on the command line). This allows
## filtering exactly which commits will be read by gitchangelog.
##
## To get full documentation on the format of these strings, please
## refer to the ``git rev-list`` arguments. There are many examples.
##
## Using callables is especially useful, for instance, if you
## are using gitchangelog to generate your changelog incrementally.
##
## Some helpers are provided, you can use them::
##
##   - FileFirstRegexMatch(file, pattern): will return a callable that will
##     return the first string match for the given pattern in the given file.
##     If you use named sub-patterns in your regex pattern, it'll output only
##     the string matching the regex pattern named "rev".
##
##   - Caret(rev): will return the rev prefixed by a "^", which is a
##     way to remove the given revision and all its ancestors.
##
## Please note that if you provide a rev-list on the command line, it'll
## replace this value (which will then be ignored).
##
## If empty, then ``gitchangelog`` will act as if it had to generate a full
## changelog.
##
## The default is to use all commits to make the changelog.
#revs = ["^1.0.3", ]
#revs = [
#    Caret(
#        FileFirstRegexMatch(
#            "CHANGELOG.rst",
#            r"(?P<rev>[0-9]+\.[0-9]+(\.[0-9]+)?)\s+\([0-9]+-[0-9]{2}-[0-9]{2}\)\n--+\n")),
#    "HEAD"
#]
#revs = []
```
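The ``section_regexps`` and ``subject_process`` settings above are plain regexps, so they can be exercised directly with Python's ``re`` module. A minimal sketch (the ``classify``/``clean_subject`` helpers and the sample subjects are illustrative, not part of gitchangelog's API):

```python
import re

# The three section patterns from ``section_regexps`` above.
SECTION_REGEXPS = {
    'New': r'^[nN]ew\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
    'Changes': r'^[cC]hg\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
    'Fixes': r'^[fF]ix\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
}

# The rewrite pattern from the ReSub in ``subject_process``.
SUBJECT_RE = r'^([cC]hg|[fF]ix|[nN]ew)\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n@]*)(@[a-z]+\s+)*$'


def classify(subject):
    """Return the changelog section a commit subject falls into, else 'Other'."""
    for section, pattern in SECTION_REGEXPS.items():
        if re.match(pattern, subject):
            return section
    return 'Other'


def clean_subject(subject):
    """Strip the ACTION/AUDIENCE prefix, as ``subject_process`` does."""
    return re.sub(SUBJECT_RE, r'\4', subject).strip()


print(classify("new: usr: support of bazaar implemented"))  # New
print(classify("Merge branch 'main'"))                      # Other
print(clean_subject("fix: doc: correct typo in README"))    # correct typo in README
```

Commits matching no section pattern fall through to the ``('Other', None)`` catch-all, which is why it comes last in the list.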
New workflow file — 19 additions (`@@ -0,0 +1,19 @@`):

```yaml
name: Publish to crates.io

on:
  workflow_call:
    inputs:
      plan:
        required: true
        type: string

jobs:
  publish:
    runs-on: ubuntu-latest
    env:
      PLAN: ${{ inputs.plan }}
      CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_REGISTRY_TOKEN }}
    steps:
      - uses: actions/checkout@v4
      - uses: dtolnay/rust-toolchain@stable
      - run: cargo publish
```
Release workflow (`@@ -1,125 +1,211 @@`):

```diff
+# Copyright 2022-2023, axodotdev
+# SPDX-License-Identifier: MIT or Apache-2.0
+#
+# CI that:
+#
+#  * checks for a Git Tag that looks like a release
+#  * builds artifacts with cargo-dist (archives, installers, hashes)
+#  * uploads those artifacts to temporary workflow zip
+#  * on success, uploads the artifacts to a Github Release™
+#
+# Note that the Github Release™ will be created with a generated
+# title/body based on your changelogs.
+name: Release
+
+permissions:
+  contents: write
+
+# This task will run whenever you push a git tag that looks like a version
+# like "1.0.0", "v0.1.0-prerelease.1", "my-app/0.1.0", "releases/v1.0.0", etc.
+# Various formats will be parsed into a VERSION and an optional PACKAGE_NAME, where
+# PACKAGE_NAME must be the name of a Cargo package in your workspace, and VERSION
+# must be a Cargo-style SemVer Version (must have at least major.minor.patch).
+#
+# If PACKAGE_NAME is specified, then the release will be for that
+# package (erroring out if it doesn't have the given version or isn't cargo-dist-able).
+#
+# If PACKAGE_NAME isn't specified, then the release will be for all
+# (cargo-dist-able) packages in the workspace with that version (this mode is
+# intended for workspaces with only one dist-able package, or with all dist-able
+# packages versioned/released in lockstep).
+#
+# If you push multiple tags at once, separate instances of this workflow will
+# spin up, creating an independent Github Release™ for each one. However Github
+# will hard limit this to 3 tags per commit, as it will assume more tags is a
+# mistake.
+#
+# If there's a prerelease-style suffix to the version, then the Github Release™
+# will be marked as a prerelease.
 on:
   push:
     tags:
-      - "v*"
-
-name: Create release
+      - '**[0-9]+.[0-9]+.[0-9]+*'
+  pull_request:
 
 jobs:
-  create-release:
-    name: Create release
+  # Run 'cargo dist plan' to determine what tasks we need to do
+  plan:
     runs-on: ubuntu-latest
     outputs:
-      upload_url: "${{ steps.create_release.outputs.upload_url }}"
+      val: ${{ steps.plan.outputs.manifest }}
+      tag: ${{ !github.event.pull_request && github.ref_name || '' }}
+      tag-flag: ${{ !github.event.pull_request && format('--tag={0}', github.ref_name) || '' }}
+      publishing: ${{ !github.event.pull_request }}
+    env:
+      GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
     steps:
-      - id: create_release
-        uses: actions/create-release@v1
-        env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+      - uses: actions/checkout@v4
         with:
-          tag_name: ${{ github.ref }}
-          release_name: ${{ github.ref }}
-          draft: true
-          prerelease: false
+          submodules: recursive
+      - name: Install cargo-dist
+        run: "curl --proto '=https' --tlsv1.2 -LsSf https://github.com/axodotdev/cargo-dist/releases/download/v0.4.3/cargo-dist-installer.sh | sh"
+      - id: plan
+        run: |
+          cargo dist plan ${{ !github.event.pull_request && format('--tag={0}', github.ref_name) || '' }} --output-format=json > dist-manifest.json
+          echo "cargo dist plan ran successfully"
+          cat dist-manifest.json
+          echo "manifest=$(jq -c "." dist-manifest.json)" >> "$GITHUB_OUTPUT"
+      - name: "Upload dist-manifest.json"
+        uses: actions/upload-artifact@v3
+        with:
+          name: artifacts
+          path: dist-manifest.json
 
-  build-linux:
-    name: Linux binary
-    needs: create-release
+  # Build and package all the platform-specific things
+  upload-local-artifacts:
+    # Let the initial task tell us to not run (currently very blunt)
+    needs: plan
+    if: ${{ fromJson(needs.plan.outputs.val).releases != null && (needs.plan.outputs.publishing == 'true' || fromJson(needs.plan.outputs.val).ci.github.pr_run_mode == 'upload') }}
+    strategy:
+      fail-fast: false
+      # Target platforms/runners are computed by cargo-dist in create-release.
+      # Each member of the matrix has the following arguments:
+      #
+      # - runner: the github runner
+      # - dist-args: cli flags to pass to cargo dist
+      # - install-dist: expression to run to install cargo-dist on the runner
+      #
+      # Typically there will be:
+      # - 1 "global" task that builds universal installers
+      # - N "local" tasks that build each platform's binaries and platform-specific installers
+      matrix: ${{ fromJson(needs.plan.outputs.val).ci.github.artifacts_matrix }}
+    runs-on: ${{ matrix.runner }}
+    env:
+      GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+      BUILD_MANIFEST_NAME: target/distrib/${{ join(matrix.targets, '-') }}-dist-manifest.json
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          submodules: recursive
+      - uses: swatinem/rust-cache@v2
+      - name: Install cargo-dist
+        run: ${{ matrix.install_dist }}
+      - name: Install dependencies
+        run: |
+          ${{ matrix.packages_install }}
+      - name: Build artifacts
+        run: |
+          # Actually do builds and make zips and whatnot
+          cargo dist build ${{ needs.plan.outputs.tag-flag }} --print=linkage --output-format=json ${{ matrix.dist_args }} > dist-manifest.json
+          echo "cargo dist ran successfully"
+      - id: cargo-dist
+        name: Post-build
+        # We force bash here just because github makes it really hard to get values up
+        # to "real" actions without writing to env-vars, and writing to env-vars has
+        # inconsistent syntax between shell and powershell.
+        shell: bash
+        run: |
+          # Parse out what we just built and upload it to the Github Release™
+          echo "paths<<EOF" >> "$GITHUB_OUTPUT"
+          jq --raw-output ".artifacts[]?.path | select( . != null )" dist-manifest.json >> "$GITHUB_OUTPUT"
+          echo "EOF" >> "$GITHUB_OUTPUT"
+
+          cp dist-manifest.json "$BUILD_MANIFEST_NAME"
+      - name: "Upload artifacts"
+        uses: actions/upload-artifact@v3
+        with:
+          name: artifacts
+          path: |
+            ${{ steps.cargo-dist.outputs.paths }}
+            ${{ env.BUILD_MANIFEST_NAME }}
+
+  # Build and package all the platform-agnostic(ish) things
+  upload-global-artifacts:
+    needs: [plan, upload-local-artifacts]
+    runs-on: "ubuntu-20.04"
+    env:
+      GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          submodules: recursive
+      - name: Install cargo-dist
+        run: "curl --proto '=https' --tlsv1.2 -LsSf https://github.com/axodotdev/cargo-dist/releases/download/v0.4.3/cargo-dist-installer.sh | sh"
+      # Get all the local artifacts for the global tasks to use (for e.g. checksums)
+      - name: Fetch local artifacts
+        uses: actions/download-artifact@v3
+        with:
+          name: artifacts
+          path: target/distrib/
+      - id: cargo-dist
+        shell: bash
+        run: |
+          cargo dist build ${{ needs.plan.outputs.tag-flag }} --output-format=json "--artifacts=global" > dist-manifest.json
+          echo "cargo dist ran successfully"
+
+          # Parse out what we just built and upload it to the Github Release™
+          echo "paths<<EOF" >> "$GITHUB_OUTPUT"
+          jq --raw-output ".artifacts[]?.path | select( . != null )" dist-manifest.json >> "$GITHUB_OUTPUT"
+          echo "EOF" >> "$GITHUB_OUTPUT"
+      - name: "Upload artifacts"
+        uses: actions/upload-artifact@v3
+        with:
+          name: artifacts
+          path: ${{ steps.cargo-dist.outputs.paths }}
+
+  should-publish:
+    needs:
+      - plan
+      - upload-local-artifacts
+      - upload-global-artifacts
+    if: ${{ needs.plan.outputs.publishing == 'true' }}
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
-      - uses: actions-rs/toolchain@v1
-        with:
-          profile: minimal
-          toolchain: stable
-          override: true
+      - name: print tag
+        run: echo "ok we're publishing!"
 
-      - uses: actions-rs/cargo@v1
-        with:
-          command: build
-          args: --release --locked
+  custom-publish-crate:
+    needs: [plan, should-publish]
+    if: ${{ !fromJson(needs.plan.outputs.val).announcement_is_prerelease || fromJson(needs.plan.outputs.val).publish_prereleases }}
+    uses: ./.github/workflows/publish-crate.yml
+    with:
+      plan: ${{ needs.plan.outputs.val }}
+    secrets: inherit
 
-      - run: strip target/release/obsidian-export
-
-      - uses: actions/upload-artifact@v2
-        with:
-          name: Linux binary
-          path: target/release/obsidian-export
-          retention-days: 7
-
-      - uses: actions/upload-release-asset@v1
-        env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-        with:
-          upload_url: ${{ needs.create-release.outputs.upload_url }}
-          asset_path: target/release/obsidian-export
-          asset_name: obsidian-export_Linux-x86_64
-          asset_content_type: application/octet-stream
-
-  build-windows:
-    name: Windows binary
-    needs: create-release
-    runs-on: windows-latest
+  # Create a Github Release with all the results once everything is done
+  publish-release:
+    needs: [plan, should-publish]
+    runs-on: ubuntu-latest
+    env:
+      GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
     steps:
-      - uses: actions/checkout@v2
-      - uses: actions-rs/toolchain@v1
-        with:
-          profile: minimal
-          toolchain: stable
-          override: true
-
-      - uses: actions-rs/cargo@v1
-        with:
-          command: build
-          args: --release --locked
-
-      - run: strip target/release/obsidian-export.exe
-
-      - uses: actions/upload-artifact@v2
-        with:
-          name: Windows binary
-          path: target/release/obsidian-export.exe
-          retention-days: 7
-
-      - uses: actions/upload-release-asset@v1
-        env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-        with:
-          upload_url: ${{ needs.create-release.outputs.upload_url }}
-          asset_path: target/release/obsidian-export.exe
-          asset_name: obsidian-export_Windows-x64_64
-          asset_content_type: application/octet-stream
-
-  build-macos:
-    name: Mac OS binary
-    needs: create-release
-    runs-on: macos-latest
-    steps:
-      - uses: actions/checkout@v2
-      - uses: actions-rs/toolchain@v1
-        with:
-          profile: minimal
-          toolchain: stable
-          override: true
-
-      - uses: actions-rs/cargo@v1
-        with:
-          command: build
-          args: --release --locked
-
-      - run: strip target/release/obsidian-export
-
-      - uses: actions/upload-artifact@v2
-        with:
-          name: MacOS binary
-          path: target/release/obsidian-export
-          retention-days: 7
-
-      - uses: actions/upload-release-asset@v1
-        env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-        with:
-          upload_url: ${{ needs.create-release.outputs.upload_url }}
-          asset_path: target/release/obsidian-export
-          asset_name: obsidian-export_MacOS-x86_64
-          asset_content_type: application/octet-stream
+      - uses: actions/checkout@v4
+        with:
+          submodules: recursive
+      - name: "Download artifacts"
+        uses: actions/download-artifact@v3
+        with:
+          name: artifacts
+          path: artifacts
+      - name: Cleanup
+        run: |
+          # Remove the granular manifests
+          rm artifacts/*-dist-manifest.json
+      - name: Create Release
+        uses: ncipollo/release-action@v1
+        with:
+          tag: ${{ needs.plan.outputs.tag }}
+          name: ${{ fromJson(needs.plan.outputs.val).announcement_title }}
+          body: ${{ fromJson(needs.plan.outputs.val).announcement_github_body }}
+          prerelease: ${{ fromJson(needs.plan.outputs.val).announcement_is_prerelease }}
+          artifacts: "artifacts/*"
```
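The `jq --raw-output ".artifacts[]?.path | select( . != null )"` step above collects every non-null artifact path from `dist-manifest.json` for upload. The same selection in Python, run against an invented miniature manifest (real cargo-dist manifests carry many more fields):

```python
import json

# Invented sample of the manifest shape the workflow consumes.
manifest = json.loads("""
{
  "artifacts": [
    {"name": "obsidian-export.tar.xz", "path": "target/distrib/obsidian-export.tar.xz"},
    {"name": "metadata-only-entry"}
  ]
}
""")

# Mirrors: jq --raw-output ".artifacts[]?.path | select( . != null )"
paths = [a["path"] for a in manifest.get("artifacts", []) if a.get("path") is not None]
print("\n".join(paths))  # target/distrib/obsidian-export.tar.xz
```

Entries without a `path` (metadata-only artifacts) are silently skipped, which is exactly what the `select( . != null )` filter does.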
CI tests workflow (`@@ -1,80 +1,184 @@`):

```diff
-name: CI tests
 on: [push, pull_request]
+
+name: CI tests
+
+env:
+  SCCACHE_GHA_ENABLED: "true"
+  RUSTC_WRAPPER: "sccache"
 
 jobs:
-  linting:
+  build:
+    name: Build project
     runs-on: ubuntu-latest
+    outputs:
+      rustc_cache_key: ${{ steps.setup_rust.outputs.cachekey }}
+    steps:
+      - uses: actions/checkout@v4
+      - uses: dtolnay/rust-toolchain@stable
+        id: setup_rust
+        with:
+          components: "rustfmt, clippy"
+      - uses: actions/cache@v3
+        with:
+          path: |
+            ~/.cargo/bin/
+            ~/.cargo/registry/index/
+            ~/.cargo/registry/cache/
+            ~/.cargo/git/db/
+            target/
+          key: "cargo-base-${{ steps.setup_rust.outputs.cachekey }}-${{ hashFiles('**/Cargo.lock') }}"
+          restore-keys: |
+            cargo-base-${{ env.RUSTC_CACHEKEY }}
+      - name: Run sccache-cache
+        uses: mozilla-actions/sccache-action@v0.0.3
+      - run: cargo build --locked --all-targets
+
+  lint:
+    name: Run lints
+    runs-on: ubuntu-latest
+    needs: build
     steps:
-      - uses: actions/checkout@v2
-      - uses: actions-rs/toolchain@v1
-        with:
-          profile: minimal
-          toolchain: stable
-          override: true
-
-      - run: rustup component add rustfmt
-      - uses: actions-rs/cargo@v1
-        with:
-          command: fmt
-          args: --all -- --check
-
-      - uses: actions-rs/cargo@v1
-        with:
-          command: check
-
-      - run: rustup component add clippy
-      - uses: actions-rs/cargo@v1
-        with:
-          command: clippy
-          args: -- -D warnings
+      - uses: actions/checkout@v4
+      - uses: actions/cache@v3
+        with:
+          path: |
+            ~/.cargo/bin/
+            ~/.cargo/registry/index/
+            ~/.cargo/registry/cache/
+            ~/.cargo/git/db/
+            target/
+          key: "cargo-lint-${{ needs.build.outputs.rustc_cache_key }}-${{ hashFiles('**/Cargo.lock') }}"
+          restore-keys: |
+            cargo-lint-${{ env.RUSTC_CACHEKEY }}
+            cargo-base-${{ env.RUSTC_CACHEKEY }}
+          fail-on-cache-miss: true
+      - name: Run sccache-cache
+        uses: mozilla-actions/sccache-action@v0.0.3
+
+      - run: cargo fmt --all -- --check
+      - run: cargo check
+      - run: cargo clippy -- -D warnings
+
+  pre-commit:
+    name: Run pre-commit
+    runs-on: ubuntu-latest
+    needs: build
+    env:
+      # These hooks are expensive and already run as dedicated jobs above
+      SKIP: "tests,clippy"
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/cache@v3
+        with:
+          path: |
+            ~/.cargo/bin/
+            ~/.cargo/registry/index/
+            ~/.cargo/registry/cache/
+            ~/.cargo/git/db/
+            target/
+          key: "cargo-lint-${{ needs.build.outputs.rustc_cache_key }}-${{ hashFiles('**/Cargo.lock') }}"
+          restore-keys: |
+            cargo-lint-${{ env.RUSTC_CACHEKEY }}
+            cargo-base-${{ env.RUSTC_CACHEKEY }}
+          fail-on-cache-miss: true
+      - name: Run sccache-cache
+        uses: mozilla-actions/sccache-action@v0.0.3
+
+      - uses: actions/setup-python@v5
+      - name: set PYVERSION
+        run: echo "PYVERSION=$(python --version | tr ' ' '-')" >> $GITHUB_ENV
+      - uses: actions/cache@v3
+        with:
+          path: ~/.cache/pre-commit
+          # Changes to pre-commit-config.yaml may require the installation of
+          # new binaries/scripts. When a cache hit occurs, changes to the cache
+          # aren't persisted at the end of the run, so making the key dependent
+          # on the configuration file ensures we always persist a complete cache.
+          key: pre-commit-${{ env.PYVERSION }}-${{ hashFiles('.pre-commit-config.yaml') }}
+
+      - run: pip install pre-commit
+      - run: pre-commit run --all --color=always --show-diff-on-failure
 
   test-linux:
+    name: Test on Linux
     runs-on: ubuntu-latest
+    needs: build
     steps:
-      - uses: actions/checkout@v2
-      - uses: actions-rs/toolchain@v1
-        with:
-          profile: minimal
-          toolchain: stable
-          override: true
-      - uses: actions-rs/cargo@v1
-        with:
-          command: test
+      - uses: actions/checkout@v4
+      - uses: actions/cache@v3
+        with:
+          path: |
+            ~/.cargo/bin/
+            ~/.cargo/registry/index/
+            ~/.cargo/registry/cache/
+            ~/.cargo/git/db/
+            target/
+          key: "cargo-test-${{ needs.build.outputs.rustc_cache_key }}-${{ hashFiles('**/Cargo.lock') }}"
+          restore-keys: |
+            cargo-test-${{ env.RUSTC_CACHEKEY }}
+            cargo-base-${{ env.RUSTC_CACHEKEY }}
+          fail-on-cache-miss: true
+      - name: Run sccache-cache
+        uses: mozilla-actions/sccache-action@v0.0.3
+
+      - run: cargo test
 
   test-windows:
+    name: Test on Windows
     runs-on: windows-latest
+    needs: build
     steps:
       - run: git config --system core.autocrlf false && git config --system core.eol lf
-      - uses: actions/checkout@v2
-      - uses: actions-rs/toolchain@v1
-        with:
-          profile: minimal
-          toolchain: stable
-          override: true
-      - uses: actions-rs/cargo@v1
-        with:
-          command: test
+      - uses: actions/checkout@v4
+      - uses: dtolnay/rust-toolchain@stable
+        id: setup_rust
+      - uses: actions/cache@v3
+        with:
+          path: |
+            ~/.cargo/bin/
+            ~/.cargo/registry/index/
+            ~/.cargo/registry/cache/
+            ~/.cargo/git/db/
+            target/
+          key: "cargo-windows-${{ needs.build.outputs.rustc_cache_key }}-${{ hashFiles('**/Cargo.lock') }}"
+          restore-keys: |
+            cargo-windows-${{ env.RUSTC_CACHEKEY }}
+            cargo-base-${{ env.RUSTC_CACHEKEY }}
+          fail-on-cache-miss: true
+      - name: Run sccache-cache
+        uses: mozilla-actions/sccache-action@v0.0.3
+
+      - run: cargo test
 
   coverage:
+    name: Code coverage
     runs-on: ubuntu-latest
+    needs: build
     steps:
-      - uses: actions/checkout@v2
-      - uses: actions-rs/toolchain@v1
+      - uses: actions/checkout@v4
+      - uses: actions/cache@v3
         with:
```
profile: minimal
|
||||
toolchain: stable
|
||||
override: true
|
||||
path: |
|
||||
~/.cargo/bin/
|
||||
~/.cargo/registry/index/
|
||||
~/.cargo/registry/cache/
|
||||
~/.cargo/git/db/
|
||||
target/
|
||||
key: "cargo-coverage-${{ needs.build.outputs.rustc_cache_key }}-${{ hashFiles('**/Cargo.lock') }}"
|
||||
restore-keys: |
|
||||
cargo-coverage-${{ env.RUSTC_CACHEKEY }}
|
||||
cargo-base-${{ env.RUSTC_CACHEKEY }}
|
||||
fail-on-cache-miss: true
|
||||
- name: Run sccache-cache
|
||||
uses: mozilla-actions/sccache-action@v0.0.3
|
||||
|
||||
- uses: actions-rs/tarpaulin@v0.1
|
||||
with:
|
||||
version: "latest"
|
||||
# Constrained by https://github.com/actions-rs/tarpaulin/pull/23
|
||||
version: "0.22.0"
|
||||
args: "--ignore-tests"
|
||||
out-type: "Html"
|
||||
- uses: actions/upload-artifact@v2
|
||||
- uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: tarpaulin-report
|
||||
path: tarpaulin-report.html
|
||||
|
|
|
@@ -2,18 +2,34 @@
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: 9136088a246768144165fcc3ecc3d31bb686920a # frozen: v3.3.0
rev: 38b88246ccc552bffaaf54259d064beeee434539 # frozen: v4.0.1
hooks:
- id: check-case-conflict
- id: check-symlinks
- id: check-yaml
- repo: https://github.com/doublify/pre-commit-rust
rev: eeee35a89e69d5772bdee97db1a6a898467b686e # frozen: v1.0
hooks:
- id: fmt
- id: cargo-check
- id: clippy
args: ["--", "-D", "warnings"]
- id: end-of-file-fixer
- id: mixed-line-ending
- id: trailing-whitespace
exclude: '^(README.md|tests/testdata/expected/.*)$'
- repo: local
hooks:
- id: rustfmt
name: Check formatting
entry: cargo fmt --
language: system
files: \.rs$
- id: tests
name: Run tests
entry: cargo test
language: system
files: \.rs$
pass_filenames: false
- id: clippy
name: Check clippy lints
entry: cargo clippy -- -D warnings
language: system
files: \.rs$
pass_filenames: false
- id: README
name: Render README.md
entry: docs/generate.sh
@@ -0,0 +1,800 @@
# Changelog

## v23.12.0 (2023-12-03)

### New

- Implement frontmatter based filtering (#163) [Martin Heuschober]

This allows limiting the notes that will be exported using `--skip-tags` and `--only-tags`:

- using `--skip-tags foo --skip-tags bar` will skip any files that have the tags `foo` or `bar` in their frontmatter
- using `--only-tags foo --only-tags bar` will skip any files that **don't** have the tags `foo` or `bar` in their frontmatter

### Fixes

- Trim filenames while resolving wikilinks [Nick Groenen]

Obsidian trims the filename part in a [[WikiLink|label]], so all of
these are equivalent:

```
[[wikilink]]
[[ wikilink ]]
[[ wikilink |wikilink]]
```

Obsidian-export now behaves similarly.

Fixes #188
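The trimming behaviour can be illustrated with a small, self-contained sketch. This is not the actual obsidian-export code; `parse_wikilink` is a hypothetical helper shown only to demonstrate why the three forms above resolve identically:

```rust
// Hypothetical helper: split a wikilink body into (filename, label),
// trimming whitespace around the filename part the way Obsidian does.
fn parse_wikilink(inner: &str) -> (String, Option<String>) {
    match inner.split_once('|') {
        Some((file, label)) => (file.trim().to_string(), Some(label.to_string())),
        None => (inner.trim().to_string(), None),
    }
}

fn main() {
    // All three spellings resolve to the same filename:
    assert_eq!(parse_wikilink("wikilink").0, "wikilink");
    assert_eq!(parse_wikilink(" wikilink ").0, "wikilink");
    assert_eq!(parse_wikilink(" wikilink |wikilink").0, "wikilink");
}
```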

### Other

- Relicense to BSD-2-Clause Plus Patent License [Nick Groenen]

This license achieves everything that dual-licensing under MIT + Apache
aims for, but without the weirdness of being under two licenses.

Having checked external contributions, I feel pretty confident that I
can unilaterally make this license change, as people have only
contributed a handful of one-line changes of no significance towards
copyrighted work up to this point.

- Add a lifetime annotation to the Postprocessor type [Robert Sesek]

This lets the compiler reason about the lifetimes of objects used by the
postprocessor, if the callback captures variables.

See zoni/obsidian-export#175

- Use cargo-dist to create release artifacts [Nick Groenen]

This will create binaries for more platforms (including ARM builds for
MacOS) and installer scripts in addition to just the binaries themselves.

## v22.11.0 (2022-11-19)

### New

* Apply unicode normalization while resolving notes. [Nick Groenen]

The unicode standard allows for certain (visually) identical characters to
be represented in different ways.

For example the character ä may be represented as a single combined
codepoint "Latin Small Letter A with Diaeresis" (U+00E4) or by the
combination of "Latin Small Letter A" (U+0061) followed by "Combining
Diaeresis" (U+0308).

When encoded with UTF-8, these are represented as respectively the two
bytes 0xC3 0xA4, and the three bytes 0x61 0xCC 0x88.

A user linking to notes with these characters in their titles would
expect these two variants to link to the same file, given they are
visually identical and have the exact same semantic meaning.

The unicode standard defines a method to deconstruct and normalize these
forms, so that a byte comparison on the normalized forms of these
variants ends up comparing the same thing. This is called Unicode
Normalization, defined in Unicode® Standard Annex #15
(http://www.unicode.org/reports/tr15/).

The W3C Working Group has written an excellent explanation of the
problems regarding string matching, and how unicode normalization helps
with this process: https://www.w3.org/TR/charmod-norm/#unicodeNormalization

With this change, obsidian-export will perform unicode normalization
(specifically the C (or NFC) normalization form) on all note titles
while looking up link references, ensuring visually identical links are
treated as being similar, even if they were encoded as different
variants.

A special thanks to Hans Raaf (@oderwat) for reporting and helping track
down this issue.
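The problem being solved here can be demonstrated in a few lines. This sketch only shows why a plain byte comparison is insufficient; the actual normalization step in obsidian-export is performed by a Unicode normalization library, not reproduced here:

```rust
fn main() {
    let composed = "\u{00E4}";    // "ä" as a single codepoint, U+00E4
    let decomposed = "a\u{0308}"; // "a" (U+0061) + combining diaeresis (U+0308)

    // Visually identical, yet byte-for-byte different:
    assert_ne!(composed, decomposed);
    assert_eq!(composed.as_bytes(), [0xC3, 0xA4]);       // two bytes
    assert_eq!(decomposed.as_bytes(), [0x61, 0xCC, 0x88]); // three bytes
}
```

Normalizing both strings to NFC before comparison makes them compare equal, which is exactly what the note lookup now does.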

### Breaking Changes (affects library API only)

* Pass context and events as mutable references to postprocessors. [Nick Groenen]

Instead of passing clones of context and the markdown tree to
postprocessors, pass them a mutable reference which may be modified
in-place.

This is a breaking change to the postprocessor implementation, changing
both the input arguments as well as the return value:

```diff
- dyn Fn(Context, MarkdownEvents) -> (Context, MarkdownEvents, PostprocessorResult) + Send + Sync;
+ dyn Fn(&mut Context, &mut MarkdownEvents) -> PostprocessorResult + Send + Sync;
```

With this change the postprocessor API becomes a little more ergonomic
to use, especially making the intent around return statements clearer.
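Under the new signature, a postprocessor mutates its arguments in place and only returns a result variant. A minimal sketch with stand-in types (the real `Context`, `MarkdownEvents`, and `PostprocessorResult` are defined by the obsidian-export crate; this example substitutes trivial placeholders):

```rust
// Stand-in types; obsidian-export's real MarkdownEvents wraps
// pulldown-cmark events rather than plain strings.
#[derive(Debug, PartialEq)]
#[allow(dead_code)]
enum PostprocessorResult {
    Continue,
    Skip,
}

struct Context; // stand-in for the note's export context
type MarkdownEvents = Vec<String>;

// The postprocessor edits `events` directly instead of returning copies.
fn uppercase_postprocessor(
    _ctx: &mut Context,
    events: &mut MarkdownEvents,
) -> PostprocessorResult {
    for event in events.iter_mut() {
        *event = event.to_uppercase();
    }
    PostprocessorResult::Continue
}

fn main() {
    let mut ctx = Context;
    let mut events: MarkdownEvents = vec!["hello".to_string()];
    let result = uppercase_postprocessor(&mut ctx, &mut events);
    assert_eq!(result, PostprocessorResult::Continue);
    assert_eq!(events, vec!["HELLO".to_string()]);
}
```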

### Other

* Use path.Join to construct hugo links (#92) [Chang-Yen Tseng]

Use path.Join so that it will render correctly on Windows
(path.Join will convert Windows backslash to forward slash)

* Bump crossbeam-utils from 0.8.5 to 0.8.12. [dependabot[bot]]

Bumps [crossbeam-utils](https://github.com/crossbeam-rs/crossbeam) from 0.8.5 to 0.8.12.
- [Release notes](https://github.com/crossbeam-rs/crossbeam/releases)
- [Changelog](https://github.com/crossbeam-rs/crossbeam/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crossbeam-rs/crossbeam/compare/crossbeam-utils-0.8.5...crossbeam-utils-0.8.12)

---
updated-dependencies:
- dependency-name: crossbeam-utils
dependency-type: indirect
...

* Bump regex from 1.6.0 to 1.7.0. [dependabot[bot]]

Bumps [regex](https://github.com/rust-lang/regex) from 1.6.0 to 1.7.0.
- [Release notes](https://github.com/rust-lang/regex/releases)
- [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/regex/compare/1.6.0...1.7.0)

---
updated-dependencies:
- dependency-name: regex
dependency-type: direct:production
update-type: version-update:semver-minor
...

* Bump actions/checkout from 2 to 3. [dependabot[bot]]

Bumps [actions/checkout](https://github.com/actions/checkout) from 2 to 3.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v2...v3)

---
updated-dependencies:
- dependency-name: actions/checkout
dependency-type: direct:production
update-type: version-update:semver-major
...

* Bump actions/upload-artifact from 2 to 3. [dependabot[bot]]

Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 2 to 3.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v2...v3)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
dependency-type: direct:production
update-type: version-update:semver-major
...

* Bump thread_local from 1.1.3 to 1.1.4. [dependabot[bot]]

Bumps [thread_local](https://github.com/Amanieu/thread_local-rs) from 1.1.3 to 1.1.4.
- [Release notes](https://github.com/Amanieu/thread_local-rs/releases)
- [Commits](https://github.com/Amanieu/thread_local-rs/compare/v1.1.3...1.1.4)

---
updated-dependencies:
- dependency-name: thread_local
dependency-type: indirect
...

* Remove needless borrows. [Nick Groenen]

* Upgrade snafu to 0.7.x. [Nick Groenen]

* Upgrade pulldown-cmark-to-cmark to 10.0.x. [Nick Groenen]

* Upgrade serde_yaml to 0.9.x. [Nick Groenen]

* Upgrade minor dependencies. [Nick Groenen]

* Fix new clippy lints. [Nick Groenen]

* Add a contributor guide. [Nick Groenen]

* Simplify pre-commit setup. [Nick Groenen]

No need to depend on a third-party hook repository when each of these
checks is easily defined and run through system commands.

This also allows us to actually run tests, which is currently unsupported
(https://github.com/doublify/pre-commit-rust/pull/19)

* Bump tempfile from 3.2.0 to 3.3.0. [dependabot[bot]]

Bumps [tempfile](https://github.com/Stebalien/tempfile) from 3.2.0 to 3.3.0.
- [Release notes](https://github.com/Stebalien/tempfile/releases)
- [Changelog](https://github.com/Stebalien/tempfile/blob/master/NEWS)
- [Commits](https://github.com/Stebalien/tempfile/compare/v3.2.0...v3.3.0)

---
updated-dependencies:
- dependency-name: tempfile
dependency-type: direct:production
update-type: version-update:semver-minor
...

## v22.1.0 (2022-01-02)

Happy new year! On this second day of 2022 comes a fresh release with one
notable new feature.

### New

* Support Obsidian's "Strict line breaks" setting. [Nick Groenen]

This change introduces a new `--hard-linebreaks` CLI argument. When
used, this converts soft line breaks to hard line breaks, mimicking
Obsidian's "Strict line breaks" setting.

> Implementation detail: I considered naming this flag
> `--strict-line-breaks` to be consistent with Obsidian itself, however I
> feel the name is somewhat misleading and ill-chosen.

### Other

* Give release binaries file extensions. [Nick Groenen]

This may make it more clear to users that these are precompiled, binary
files. This is especially relevant on Windows, where the convention is
that executable files have a `.exe` extension, as seen in #49.

* Upgrade dependencies. [Nick Groenen]

This commit upgrades all dependencies to their current latest versions. Most
notably, this includes upgrades to the following most critical libraries:

pulldown-cmark v0.8.0 -> v0.9.0
pulldown-cmark-to-cmark v7.1.1 -> v9.0.0

In total, these dependencies were upgraded:

bstr v0.2.16 -> v0.2.17
ignore v0.4.17 -> v0.4.18
libc v0.2.101 -> v0.2.112
memoffset v0.6.4 -> v0.6.5
num_cpus v1.13.0 -> v1.13.1
once_cell v1.8.0 -> v1.9.0
ppv-lite86 v0.2.10 -> v0.2.16
proc-macro2 v1.0.29 -> v1.0.36
pulldown-cmark v0.8.0 -> v0.9.0
pulldown-cmark-to-cmark v7.1.1 -> v9.0.0
quote v1.0.9 -> v1.0.14
rayon v1.5.0 -> v1.5.1
regex v1.5.3 -> v1.5.4
serde v1.0.130 -> v1.0.132
syn v1.0.75 -> v1.0.84
unicode-width v0.1.8 -> v0.1.9
version_check v0.9.3 -> v0.9.4

* Bump serde_yaml from 0.8.21 to 0.8.23 (#52) [dependabot[bot]]

Bumps [serde_yaml](https://github.com/dtolnay/serde-yaml) from 0.8.21 to 0.8.23.
- [Release notes](https://github.com/dtolnay/serde-yaml/releases)
- [Commits](https://github.com/dtolnay/serde-yaml/compare/0.8.21...0.8.23)

---
updated-dependencies:
- dependency-name: serde_yaml
dependency-type: direct:production
update-type: version-update:semver-patch
...

* Bump pulldown-cmark-to-cmark from 7.1.0 to 7.1.1 (#51) [dependabot[bot]]

Bumps [pulldown-cmark-to-cmark](https://github.com/Byron/pulldown-cmark-to-cmark) from 7.1.0 to 7.1.1.
- [Release notes](https://github.com/Byron/pulldown-cmark-to-cmark/releases)
- [Changelog](https://github.com/Byron/pulldown-cmark-to-cmark/blob/main/CHANGELOG.md)
- [Commits](https://github.com/Byron/pulldown-cmark-to-cmark/compare/v7.1.0...v7.1.1)

---
updated-dependencies:
- dependency-name: pulldown-cmark-to-cmark
dependency-type: direct:production
update-type: version-update:semver-patch
...

* Bump pulldown-cmark-to-cmark from 7.0.0 to 7.1.0 (#48) [dependabot[bot]]

Bumps [pulldown-cmark-to-cmark](https://github.com/Byron/pulldown-cmark-to-cmark) from 7.0.0 to 7.1.0.
- [Release notes](https://github.com/Byron/pulldown-cmark-to-cmark/releases)
- [Changelog](https://github.com/Byron/pulldown-cmark-to-cmark/blob/main/CHANGELOG.md)
- [Commits](https://github.com/Byron/pulldown-cmark-to-cmark/compare/v7.0.0...v7.1.0)

---
updated-dependencies:
- dependency-name: pulldown-cmark-to-cmark
dependency-type: direct:production
update-type: version-update:semver-minor
...

* Bump pulldown-cmark-to-cmark from 6.0.4 to 7.0.0 (#47) [dependabot[bot]]

Bumps [pulldown-cmark-to-cmark](https://github.com/Byron/pulldown-cmark-to-cmark) from 6.0.4 to 7.0.0.
- [Release notes](https://github.com/Byron/pulldown-cmark-to-cmark/releases)
- [Changelog](https://github.com/Byron/pulldown-cmark-to-cmark/blob/main/CHANGELOG.md)
- [Commits](https://github.com/Byron/pulldown-cmark-to-cmark/compare/v6.0.4...v7.0.0)

---
updated-dependencies:
- dependency-name: pulldown-cmark-to-cmark
dependency-type: direct:production
update-type: version-update:semver-major
...

* Bump pathdiff from 0.2.0 to 0.2.1 (#46) [dependabot[bot]]

Bumps [pathdiff](https://github.com/Manishearth/pathdiff) from 0.2.0 to 0.2.1.
- [Release notes](https://github.com/Manishearth/pathdiff/releases)
- [Commits](https://github.com/Manishearth/pathdiff/commits)

---
updated-dependencies:
- dependency-name: pathdiff
dependency-type: direct:production
update-type: version-update:semver-patch
...

* Bump pulldown-cmark-to-cmark from 6.0.3 to 6.0.4 (#44) [dependabot[bot]]

Bumps [pulldown-cmark-to-cmark](https://github.com/Byron/pulldown-cmark-to-cmark) from 6.0.3 to 6.0.4.
- [Release notes](https://github.com/Byron/pulldown-cmark-to-cmark/releases)
- [Changelog](https://github.com/Byron/pulldown-cmark-to-cmark/blob/main/CHANGELOG.md)
- [Commits](https://github.com/Byron/pulldown-cmark-to-cmark/compare/v6.0.3...v6.0.4)

---
updated-dependencies:
- dependency-name: pulldown-cmark-to-cmark
dependency-type: direct:production
update-type: version-update:semver-patch
...

* Bump pretty_assertions from 0.7.2 to 1.0.0 (#45) [dependabot[bot]]

Bumps [pretty_assertions](https://github.com/colin-kiegel/rust-pretty-assertions) from 0.7.2 to 1.0.0.
- [Release notes](https://github.com/colin-kiegel/rust-pretty-assertions/releases)
- [Changelog](https://github.com/colin-kiegel/rust-pretty-assertions/blob/main/CHANGELOG.md)
- [Commits](https://github.com/colin-kiegel/rust-pretty-assertions/compare/v0.7.2...v1.0.0)

---
updated-dependencies:
- dependency-name: pretty_assertions
dependency-type: direct:production
update-type: version-update:semver-major
...

## v21.9.1 (2021-09-24)

### Changes

* Treat SVG files as embeddable images. [Narayan Sainaney]

This will ensure SVG files are included as an image when using `![[foo.svg]]` syntax, as opposed to only being linked to.

### Other

* Bump pulldown-cmark-to-cmark from 6.0.2 to 6.0.3. [dependabot[bot]]

Bumps [pulldown-cmark-to-cmark](https://github.com/Byron/pulldown-cmark-to-cmark) from 6.0.2 to 6.0.3.
- [Release notes](https://github.com/Byron/pulldown-cmark-to-cmark/releases)
- [Changelog](https://github.com/Byron/pulldown-cmark-to-cmark/blob/main/CHANGELOG.md)
- [Commits](https://github.com/Byron/pulldown-cmark-to-cmark/compare/v6.0.2...v6.0.3)

---
updated-dependencies:
- dependency-name: pulldown-cmark-to-cmark
dependency-type: direct:production
update-type: version-update:semver-patch
...

* Bump serde_yaml from 0.8.20 to 0.8.21. [dependabot[bot]]

Bumps [serde_yaml](https://github.com/dtolnay/serde-yaml) from 0.8.20 to 0.8.21.
- [Release notes](https://github.com/dtolnay/serde-yaml/releases)
- [Commits](https://github.com/dtolnay/serde-yaml/compare/0.8.20...0.8.21)

---
updated-dependencies:
- dependency-name: serde_yaml
dependency-type: direct:production
update-type: version-update:semver-patch
...

## v21.9.0 (2021-09-12)

> This release switches to a [calendar versioning scheme](https://calver.org/overview.html).
> Details on this decision can be read in [switching obsidian-export to CalVer](https://nick.groenen.me/posts/switching-obsidian-export-to-calver/).

### New

* Support postprocessors running on embedded notes. [Nick Groenen]

This introduces support for postprocessors that are run on the result of
a note that is being embedded into another note. This differs from the
existing postprocessors (which remain unchanged) that run once all
embeds have been processed and merged with the final note.

These "embed postprocessors" may be set through the new
`Exporter::add_embed_postprocessor` method.

* Add start_at option to export a partial vault. [Nick Groenen]

This introduces a new `--start-at` CLI argument and corresponding
`start_at()` method on the Exporter type that allows exporting of only a
given subdirectory within a vault.

See the updated README file for more details on when and how this may be
used.

### Other

* Don't build docs for the bin target. [Nick Groenen]

The library contains documentation covering both CLI and library usage,
there's no separate documentation for just the binary target.

* Move postprocessor tests into their own file for clarity. [Nick Groenen]

* Update indirect dependencies. [Nick Groenen]

* Bump serde_yaml from 0.8.19 to 0.8.20. [dependabot[bot]]

Bumps [serde_yaml](https://github.com/dtolnay/serde-yaml) from 0.8.19 to 0.8.20.
- [Release notes](https://github.com/dtolnay/serde-yaml/releases)
- [Commits](https://github.com/dtolnay/serde-yaml/compare/0.8.19...0.8.20)

---
updated-dependencies:
- dependency-name: serde_yaml
dependency-type: direct:production
update-type: version-update:semver-patch
...

* Don't borrow references that are immediately dereferenced. [Nick Groenen]

This was caught by a recently introduced clippy rule

* Bump serde_yaml from 0.8.17 to 0.8.19. [dependabot[bot]]

Bumps [serde_yaml](https://github.com/dtolnay/serde-yaml) from 0.8.17 to 0.8.19.
- [Release notes](https://github.com/dtolnay/serde-yaml/releases)
- [Commits](https://github.com/dtolnay/serde-yaml/compare/0.8.17...0.8.19)

---
updated-dependencies:
- dependency-name: serde_yaml
dependency-type: direct:production
update-type: version-update:semver-patch
...

* Update dependencies. [Nick Groenen]

* Fix 4 new clippy lints. [Nick Groenen]

* Bump regex from 1.4.6 to 1.5.3. [dependabot[bot]]

Bumps [regex](https://github.com/rust-lang/regex) from 1.4.6 to 1.5.3.
- [Release notes](https://github.com/rust-lang/regex/releases)
- [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/regex/compare/1.4.6...1.5.3)

* Bump pretty_assertions from 0.7.1 to 0.7.2. [dependabot[bot]]

Bumps [pretty_assertions](https://github.com/colin-kiegel/rust-pretty-assertions) from 0.7.1 to 0.7.2.
- [Release notes](https://github.com/colin-kiegel/rust-pretty-assertions/releases)
- [Changelog](https://github.com/colin-kiegel/rust-pretty-assertions/blob/main/CHANGELOG.md)
- [Commits](https://github.com/colin-kiegel/rust-pretty-assertions/compare/v0.7.1...v0.7.2)

* Bump regex from 1.4.5 to 1.4.6. [dependabot[bot]]

Bumps [regex](https://github.com/rust-lang/regex) from 1.4.5 to 1.4.6.
- [Release notes](https://github.com/rust-lang/regex/releases)
- [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/regex/compare/1.4.5...1.4.6)

## v0.7.0 (2021-04-11)

### New

* Postprocessing support. [Nick Groenen]

Add support for postprocessing of Markdown prior to writing converted
notes to disk.

Postprocessors may be used when making use of Obsidian export as a Rust
library to do the following:

1. Modify a note's `Context`, for example to change the destination
filename or update its Frontmatter.
2. Change a note's contents by altering `MarkdownEvents`.
3. Prevent later postprocessors from running or cause a note to be
skipped entirely.

Future releases of Obsidian export may come with built-in postprocessors
for users of the command-line tool to use, if general use-cases can be
identified.

For example, a future release might include functionality to make notes
more suitable for the Hugo static site generator. This functionality
would be implemented as a postprocessor that could be enabled through
command-line flags.

### Fixes

* Also percent-encode `?` in filenames. [Nick Groenen]

A recent Obsidian update expanded the list of allowed characters in
filenames, which now includes `?` as well. This needs to be
percent-encoded for proper links in static site generators like Hugo.
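This kind of encoding can be sketched as follows. `encode_filename` is a hypothetical helper for illustration only; the actual set of characters obsidian-export percent-encodes is larger than shown here:

```rust
// Hypothetical helper: percent-encode a few characters that are legal
// in Obsidian filenames but problematic in URLs, now including `?`.
fn encode_filename(name: &str) -> String {
    name.chars()
        .map(|c| match c {
            '?' => "%3F".to_string(),
            '#' => "%23".to_string(),
            ' ' => "%20".to_string(),
            other => other.to_string(),
        })
        .collect()
}

fn main() {
    assert_eq!(encode_filename("what now?.md"), "what%20now%3F.md");
}
```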

### Other

* Bump pretty_assertions from 0.6.1 to 0.7.1. [dependabot[bot]]

Bumps [pretty_assertions](https://github.com/colin-kiegel/rust-pretty-assertions) from 0.6.1 to 0.7.1.
- [Release notes](https://github.com/colin-kiegel/rust-pretty-assertions/releases)
- [Changelog](https://github.com/colin-kiegel/rust-pretty-assertions/blob/main/CHANGELOG.md)
- [Commits](https://github.com/colin-kiegel/rust-pretty-assertions/compare/v0.6.1...v0.7.1)

* Bump walkdir from 2.3.1 to 2.3.2. [dependabot[bot]]

Bumps [walkdir](https://github.com/BurntSushi/walkdir) from 2.3.1 to 2.3.2.
- [Release notes](https://github.com/BurntSushi/walkdir/releases)
- [Commits](https://github.com/BurntSushi/walkdir/compare/2.3.1...2.3.2)

* Bump regex from 1.4.3 to 1.4.5. [dependabot[bot]]

Bumps [regex](https://github.com/rust-lang/regex) from 1.4.3 to 1.4.5.
- [Release notes](https://github.com/rust-lang/regex/releases)
- [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/regex/compare/1.4.3...1.4.5)

## v0.6.0 (2021-02-15)

### New

* Add `--version` flag. [Nick Groenen]

### Changes

* Don't Box FilterFn in WalkOptions. [Nick Groenen]

Previously, `filter_fn` on the `WalkOptions` struct looked like:

pub filter_fn: Option<Box<&'static FilterFn>>,

This boxing was unnecessary and has been changed to:

pub filter_fn: Option<&'static FilterFn>,

This will only affect people who use obsidian-export as a library in
other Rust programs, not users of the CLI.

Those library users no longer need to supply `FilterFn`
wrapped in a Box.
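For illustration, a sketch of the unboxed form with a stand-in `FilterFn` alias (obsidian-export's real `FilterFn` operates on directory entries rather than string paths, so the signature here is an assumption made for brevity):

```rust
// Stand-in alias mirroring the shape of the unboxed field.
type FilterFn = dyn Fn(&str) -> bool + Send + Sync;

struct WalkOptions {
    filter_fn: Option<&'static FilterFn>,
}

fn only_markdown(path: &str) -> bool {
    path.ends_with(".md")
}

fn main() {
    // No Box::new(...) required anymore; a plain reference suffices.
    let options = WalkOptions { filter_fn: Some(&only_markdown) };
    let filter = options.filter_fn.unwrap();
    assert!(filter("note.md"));
    assert!(!filter("image.png"));
}
```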

### Fixes

* Recognize notes beginning with underscores. [Nick Groenen]

Notes with an underscore would fail to be recognized within Obsidian
`[[_WikiLinks]]` due to the assumption that the underlying Markdown
parser (pulldown_cmark) would emit the text between `[[` and `]]` as
a single event.

The note parser has now been rewritten to use a more reliable state
machine which correctly recognizes this corner-case (and likely some
others).

* Support self-references. [Joshua Coles]

This ensures links to headings within the same note (`[[#Heading]]`)
resolve correctly.

### Other

* Avoid redundant "Release" in GitHub release titles. [Nick Groenen]

* Add failing testcase for files with underscores. [Nick Groenen]

* Add unit tests for display of ObsidianNoteReference. [Nick Groenen]

* Add some unit tests for ObsidianNoteReference::from_str. [Nick Groenen]

* Also run tests on pull requests. [Nick Groenen]

* Apply clippy suggestions following rust 1.50.0. [Nick Groenen]

* Fix infinite recursion bug with references to current file. [Joshua Coles]

* Add tests for self-references. [Joshua Coles]

Note that as there is no support for block references at the moment, the generated link goes nowhere; it does, however, point to a reasonable ID.

* Bump tempfile from 3.1.0 to 3.2.0. [dependabot[bot]]

Bumps [tempfile](https://github.com/Stebalien/tempfile) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/Stebalien/tempfile/releases)
- [Changelog](https://github.com/Stebalien/tempfile/blob/master/NEWS)
- [Commits](https://github.com/Stebalien/tempfile/commits)

* Bump eyre from 0.6.3 to 0.6.5. [dependabot[bot]]

Bumps [eyre](https://github.com/yaahc/eyre) from 0.6.3 to 0.6.5.
- [Release notes](https://github.com/yaahc/eyre/releases)
- [Changelog](https://github.com/yaahc/eyre/blob/v0.6.5/CHANGELOG.md)
- [Commits](https://github.com/yaahc/eyre/compare/v0.6.3...v0.6.5)

* Bump regex from 1.4.2 to 1.4.3. [dependabot[bot]]

Bumps [regex](https://github.com/rust-lang/regex) from 1.4.2 to 1.4.3.
- [Release notes](https://github.com/rust-lang/regex/releases)
- [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/regex/compare/1.4.2...1.4.3)

## v0.5.1 (2021-01-10)

### Fixes

* Find uppercased notes when referenced with lowercase. [Nick Groenen]

This commit fixes a bug where, if a note contained uppercase characters
(for example `Note.md`) but was referred to using lowercase
(`[[note]]`), that note would not be found.

## v0.5.0 (2021-01-05)

### New

* Add --no-recursive-embeds to break infinite recursion cycles. [Nick Groenen]

It's possible to end up with "recursive embeds" when two notes embed
each other. This happens for example when a `Note A.md` contains
`![[Note B]]` but `Note B.md` also contains `![[Note A]]`.

By default, this will trigger an error and display the chain of notes
which caused the recursion.

Using the new `--no-recursive-embeds`, if a note is encountered for a
second time while processing the original note, rather than embedding it
again a link to the note is inserted instead to break the cycle.

See also: https://github.com/zoni/obsidian-export/issues/1
|
||||
|
||||
* Make walk options configurable on CLI. [Nick Groenen]
|
||||
|
||||
By default hidden files, patterns listed in `.export-ignore` as well as
|
||||
any files ignored by git are excluded from exports. This behavior has
|
||||
been made configurable on the CLI using the new flags `--hidden`,
|
||||
`--ignore-file` and `--no-git`.
|
||||
|
||||
* Support links referencing headings. [Nick Groenen]
|
||||
|
||||
Previously, links referencing a heading (`[[note#heading]]`) would just
|
||||
link to the file name without including an anchor in the link target.
|
||||
Now, such references will include an appropriate `#anchor` attribute.
|
||||
|
||||
Note that neither the original Markdown specification, nor the more
|
||||
recent CommonMark standard, specify how anchors should be constructed
|
||||
for a given heading.
|
||||
|
||||
There are also some differences between the various Markdown rendering
|
||||
implementations.
|
||||
|
||||
Obsidian-export uses the [slug] crate to generate anchors which should
|
||||
be compatible with most implementations, however your mileage may vary.
|
||||
|
||||
(For example, GitHub may leave a trailing `-` on anchors when headings
|
||||
end with a smiley. The slug library, and thus obsidian-export, will
|
||||
avoid such dangling dashes).
|
||||
|
||||
[slug]: https://crates.io/crates/slug
|
||||
|
||||
* Support embeds referencing headings. [Nick Groenen]
|
||||
|
||||
Previously, partial embeds (`![[note#heading]]`) would always include
|
||||
the entire file into the source note. Now, such embeds will only include
|
||||
the contents of the referenced heading (and any subheadings).
|
||||
|
||||
Links and embeds of [arbitrary blocks] remains unsupported at this time.
|
||||
|
||||
[arbitrary blocks]: https://publish.obsidian.md/help/How+to/Link+to+blocks
|
||||
|
||||
### Changes
|
||||
|
||||
* Print warnings to stderr rather than stdout. [Nick Groenen]
|
||||
|
||||
Warning messages emitted when encountering broken links/references will
|
||||
now be printed to stderr as opposed to stdout.
|
||||
|
||||
### Other
|
||||
|
||||
* Include filter_fn field in WalkOptions debug display. [Nick Groenen]
|
||||
|
||||
|
||||
|
||||
## v0.4.0 (2020-12-23)
|
||||
|
||||
### Fixes
|
||||
|
||||
* Correct relative links within embedded notes. [Nick Groenen]
|
||||
|
||||
Links within an embedded note would point to other local resources
|
||||
relative to the filesystem location of the note being embedded.
|
||||
|
||||
When a note inside a different directory would embed such a note, these
|
||||
links would point to invalid locations.
|
||||
|
||||
Now these links are calculated relative to the top note, which ensures
|
||||
these links will point to the right path.
|
||||
|
||||
### Other
|
||||
|
||||
* Add brief library documentation to all public types and functions. [Nick Groenen]
|
||||
|
||||
|
||||
|
||||
## v0.3.0 (2020-12-21)
|
||||
|
||||
### New
|
||||
|
||||
* Report file tree when RecursionLimitExceeded is hit. [Nick Groenen]
|
||||
|
||||
This refactors the Context to maintain a list of all the files which
|
||||
have been processed so far in a chain of embeds. This information is
|
||||
then used to print a more helpful error message to users of the CLI when
|
||||
RecursionLimitExceeded is returned.
|
||||
|
||||
### Changes
|
||||
|
||||
* Add extra whitespace around multi-line warnings. [Nick Groenen]
|
||||
|
||||
This makes errors a bit easier to distinguish after a number of warnings
|
||||
has been printed.
|
||||
|
||||
### Other
|
||||
|
||||
* Setup gitchangelog. [Nick Groenen]
|
||||
|
||||
This adds a changelog (CHANGES.md) which is automatically generated with
|
||||
[gitchangelog].
|
||||
|
||||
[gitchangelog]: https://github.com/vaab/gitchangelog
|
||||
|
||||
|
||||
|
||||
## v0.2.0 (2020-12-13)
|
||||
|
||||
* Allow custom filter function to be passed with WalkOptions. [Nick Groenen]
|
||||
|
||||
* Re-export vault_contents and WalkOptions as pub from crate root. [Nick Groenen]
|
||||
|
||||
* Run mdbook hook against README.md too. [Nick Groenen]
|
||||
|
||||
* Update installation instructions. [Nick Groenen]
|
||||
|
||||
Installation no longer requires a git repository URL now that a crate is
|
||||
published.
|
||||
|
||||
* Add MdBook generation script and precommit hook. [Nick Groenen]
|
||||
|
||||
* Add more reliable non-ASCII tetscase. [Nick Groenen]
|
||||
|
||||
* Create FUNDING.yml. [Nick Groenen]
|
||||
|
||||
## v0.1.0 (2020-11-28)
|
||||
|
||||
* Public release. [Nick Groenen]
|
CHANGES.md (299 deleted lines)
@@ -1,299 +0,0 @@
# Changelog

## v0.7.0 (2021-04-11)

### New

* Postprocessing support. [Nick Groenen]

  Add support for postprocessing of Markdown prior to writing converted notes to disk.

  Postprocessors may be used when making use of Obsidian export as a Rust library to do the following:

  1. Modify a note's `Context`, for example to change the destination filename or update its Frontmatter.
  2. Change a note's contents by altering `MarkdownEvents`.
  3. Prevent later postprocessors from running or cause a note to be skipped entirely.

  Future releases of Obsidian export may come with built-in postprocessors for users of the command-line tool to use, if general use-cases can be identified.

  For example, a future release might include functionality to make notes more suitable for the Hugo static site generator. This functionality would be implemented as a postprocessor that could be enabled through command-line flags.
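The postprocessor flow described above can be modeled with a small self-contained sketch. The names `Context`, `MarkdownEvents`, and `PostprocessorResult` follow the description, but these are simplified stand-ins, not the crate's actual type definitions:

```rust
// Simplified model of a postprocessor chain; these are NOT the crate's
// actual types, just an illustration of the flow described above.
type MarkdownEvents = Vec<String>; // stand-in for pulldown-cmark events

struct Context {
    destination: String, // stand-in for the note's destination path
}

enum PostprocessorResult {
    Continue,        // run the next postprocessor
    StopHere,        // prevent later postprocessors from running
    StopAndSkipNote, // skip writing this note entirely
}

type Postprocessor = fn(&mut Context, &mut MarkdownEvents) -> PostprocessorResult;

// Returns true when the note should still be written out.
fn run_postprocessors(
    ctx: &mut Context,
    events: &mut MarkdownEvents,
    chain: &[Postprocessor],
) -> bool {
    for postprocess in chain {
        match postprocess(ctx, events) {
            PostprocessorResult::Continue => {}
            PostprocessorResult::StopHere => break,
            PostprocessorResult::StopAndSkipNote => return false,
        }
    }
    true
}

// Example postprocessor: rewrite the destination filename (use-case 1 above).
fn html_extension(ctx: &mut Context, _events: &mut MarkdownEvents) -> PostprocessorResult {
    ctx.destination = ctx.destination.replace(".md", ".html");
    PostprocessorResult::Continue
}
```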
### Fixes

* Also percent-encode `?` in filenames. [Nick Groenen]

  A recent Obsidian update expanded the list of allowed characters in filenames, which now includes `?` as well. This needs to be percent-encoded for proper links in static site generators like Hugo.

### Other

* Bump pretty_assertions from 0.6.1 to 0.7.1. [dependabot[bot]]

  Bumps [pretty_assertions](https://github.com/colin-kiegel/rust-pretty-assertions) from 0.6.1 to 0.7.1.
  - [Release notes](https://github.com/colin-kiegel/rust-pretty-assertions/releases)
  - [Changelog](https://github.com/colin-kiegel/rust-pretty-assertions/blob/main/CHANGELOG.md)
  - [Commits](https://github.com/colin-kiegel/rust-pretty-assertions/compare/v0.6.1...v0.7.1)

* Bump walkdir from 2.3.1 to 2.3.2. [dependabot[bot]]

  Bumps [walkdir](https://github.com/BurntSushi/walkdir) from 2.3.1 to 2.3.2.
  - [Release notes](https://github.com/BurntSushi/walkdir/releases)
  - [Commits](https://github.com/BurntSushi/walkdir/compare/2.3.1...2.3.2)

* Bump regex from 1.4.3 to 1.4.5. [dependabot[bot]]

  Bumps [regex](https://github.com/rust-lang/regex) from 1.4.3 to 1.4.5.
  - [Release notes](https://github.com/rust-lang/regex/releases)
  - [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/rust-lang/regex/compare/1.4.3...1.4.5)
## v0.6.0 (2021-02-15)

### New

* Add `--version` flag. [Nick Groenen]

### Changes

* Don't Box FilterFn in WalkOptions. [Nick Groenen]

  Previously, `filter_fn` on the `WalkOptions` struct looked like:

      pub filter_fn: Option<Box<&'static FilterFn>>,

  This boxing was unnecessary and has been changed to:

      pub filter_fn: Option<&'static FilterFn>,

  This will only affect people who use obsidian-export as a library in other Rust programs, not users of the CLI. Those library users no longer need to supply `FilterFn` wrapped in a Box.
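For library users, the change means a plain `&'static` function reference can now be passed directly. A self-contained sketch of the pattern (the `FilterFn` alias and the `WalkOptions` field set here are simplified assumptions, not the crate's exact definitions):

```rust
use std::path::Path;

// Simplified stand-ins for the crate's FilterFn and WalkOptions; the real
// struct has more fields, this only illustrates the unboxed filter_fn shape.
pub type FilterFn = dyn Fn(&Path) -> bool + Send + Sync;

pub struct WalkOptions {
    pub filter_fn: Option<&'static FilterFn>,
}

// A filter that skips anything under a "private" directory.
pub fn skip_private(path: &Path) -> bool {
    !path.components().any(|c| c.as_os_str() == "private")
}

// No Box required: a plain &'static reference to a function now suffices.
pub fn options_with_filter() -> WalkOptions {
    WalkOptions {
        filter_fn: Some(&skip_private),
    }
}
```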
### Fixes

* Recognize notes beginning with underscores. [Nick Groenen]

  Notes with an underscore would fail to be recognized within Obsidian `[[_WikiLinks]]` due to the assumption that the underlying Markdown parser (pulldown_cmark) would emit the text between `[[` and `]]` as a single event.

  The note parser has now been rewritten to use a more reliable state machine which correctly recognizes this corner case (and likely some others).

* Support self-references. [Joshua Coles]

  This ensures links to headings within the same note (`[[#Heading]]`) resolve correctly.

### Other

* Avoid redundant "Release" in GitHub release titles. [Nick Groenen]

* Add failing testcase for files with underscores. [Nick Groenen]

* Add unit tests for display of ObsidianNoteReference. [Nick Groenen]

* Add some unit tests for ObsidianNoteReference::from_str. [Nick Groenen]

* Also run tests on pull requests. [Nick Groenen]

* Apply clippy suggestions following Rust 1.50.0. [Nick Groenen]

* Fix infinite recursion bug with references to current file. [Joshua Coles]

* Add tests for self-references. [Joshua Coles]

  Note that as there is no support for block references at the moment, the generated link goes nowhere; it does, however, point to a reasonable ID.

* Bump tempfile from 3.1.0 to 3.2.0. [dependabot[bot]]

  Bumps [tempfile](https://github.com/Stebalien/tempfile) from 3.1.0 to 3.2.0.
  - [Release notes](https://github.com/Stebalien/tempfile/releases)
  - [Changelog](https://github.com/Stebalien/tempfile/blob/master/NEWS)
  - [Commits](https://github.com/Stebalien/tempfile/commits)

* Bump eyre from 0.6.3 to 0.6.5. [dependabot[bot]]

  Bumps [eyre](https://github.com/yaahc/eyre) from 0.6.3 to 0.6.5.
  - [Release notes](https://github.com/yaahc/eyre/releases)
  - [Changelog](https://github.com/yaahc/eyre/blob/v0.6.5/CHANGELOG.md)
  - [Commits](https://github.com/yaahc/eyre/compare/v0.6.3...v0.6.5)

* Bump regex from 1.4.2 to 1.4.3. [dependabot[bot]]

  Bumps [regex](https://github.com/rust-lang/regex) from 1.4.2 to 1.4.3.
  - [Release notes](https://github.com/rust-lang/regex/releases)
  - [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/rust-lang/regex/compare/1.4.2...1.4.3)
## v0.5.1 (2021-01-10)

### Fixes

* Find uppercased notes when referenced with lowercase. [Nick Groenen]

  This commit fixes a bug where, if a note contained uppercase characters (for example `Note.md`) but was referred to using lowercase (`[[note]]`), that note would not be found.

## v0.5.0 (2021-01-05)

### New

* Add --no-recursive-embeds to break infinite recursion cycles. [Nick Groenen]

  It's possible to end up with "recursive embeds" when two notes embed each other. This happens for example when `Note A.md` contains `![[Note B]]` but `Note B.md` also contains `![[Note A]]`.

  By default, this will trigger an error and display the chain of notes which caused the recursion.

  With the new `--no-recursive-embeds`, if a note is encountered for a second time while processing the original note, a link to the note is inserted instead of embedding it again, breaking the cycle.

  See also: https://github.com/zoni/obsidian-export/issues/1
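The cycle-breaking behavior described above can be sketched with a visited-chain check. This is a simplified model over a hypothetical vault map, not the crate's actual embed machinery:

```rust
use std::collections::HashMap;

// Minimal model of embed expansion with cycle breaking. The vault is a
// hypothetical map of note name -> notes it embeds; "<X>" marks embedded
// content, "[[X]]" a plain link.
fn expand(
    note: &str,
    vault: &HashMap<&str, Vec<&str>>,
    chain: &mut Vec<String>,
    no_recursive_embeds: bool,
) -> Result<String, Vec<String>> {
    if chain.iter().any(|n| n.as_str() == note) {
        if no_recursive_embeds {
            // Insert a link instead of embedding the note again.
            return Ok(format!("[[{}]]", note));
        }
        // Error out, reporting the chain of notes that caused the recursion.
        let mut cycle = chain.clone();
        cycle.push(note.to_string());
        return Err(cycle);
    }
    chain.push(note.to_string());
    let mut output = format!("<{}>", note);
    if let Some(embeds) = vault.get(note) {
        for embed in embeds {
            output.push_str(&expand(embed, vault, chain, no_recursive_embeds)?);
        }
    }
    chain.pop();
    Ok(output)
}
```

With the flag off, mutually embedding notes produce an error carrying the chain; with it on, the second occurrence degrades to a link.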
* Make walk options configurable on CLI. [Nick Groenen]

  By default, hidden files, patterns listed in `.export-ignore`, as well as any files ignored by git, are excluded from exports. This behavior has been made configurable on the CLI using the new flags `--hidden`, `--ignore-file` and `--no-git`.

* Support links referencing headings. [Nick Groenen]

  Previously, links referencing a heading (`[[note#heading]]`) would just link to the file name without including an anchor in the link target. Now, such references will include an appropriate `#anchor` attribute.

  Note that neither the original Markdown specification, nor the more recent CommonMark standard, specifies how anchors should be constructed for a given heading. There are also some differences between the various Markdown rendering implementations.

  Obsidian-export uses the [slug] crate to generate anchors which should be compatible with most implementations, but your mileage may vary. (For example, GitHub may leave a trailing `-` on anchors when headings end with a smiley. The slug library, and thus obsidian-export, will avoid such dangling dashes.)

  [slug]: https://crates.io/crates/slug
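The anchor generation can be approximated in plain Rust. This is a rough sketch of slug-style behavior for ASCII headings only; the actual [slug] crate also transliterates non-ASCII characters, which this simplification does not:

```rust
// Rough approximation of slug-style anchor generation for ASCII headings:
// alphanumerics are lowercased, runs of any other characters collapse to a
// single '-', and no leading or trailing dashes are kept.
fn heading_to_anchor(heading: &str) -> String {
    let mut anchor = String::new();
    let mut pending_dash = false;
    for c in heading.chars() {
        if c.is_ascii_alphanumeric() {
            if pending_dash && !anchor.is_empty() {
                anchor.push('-');
            }
            anchor.push(c.to_ascii_lowercase());
            pending_dash = false;
        } else {
            pending_dash = true;
        }
    }
    anchor
}
```

Note how a heading ending in a smiley yields no dangling dash, matching the behavior described above.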
* Support embeds referencing headings. [Nick Groenen]

  Previously, partial embeds (`![[note#heading]]`) would always include the entire file in the source note. Now, such embeds will only include the contents of the referenced heading (and any subheadings).

  Links and embeds of [arbitrary blocks] remain unsupported at this time.

  [arbitrary blocks]: https://publish.obsidian.md/help/How+to/Link+to+blocks

### Changes

* Print warnings to stderr rather than stdout. [Nick Groenen]

  Warning messages emitted when encountering broken links/references will now be printed to stderr as opposed to stdout.

### Other

* Include filter_fn field in WalkOptions debug display. [Nick Groenen]
## v0.4.0 (2020-12-23)

### Fixes

* Correct relative links within embedded notes. [Nick Groenen]

  Links within an embedded note would point to other local resources relative to the filesystem location of the note being embedded. When a note inside a different directory embedded such a note, these links would point to invalid locations.

  Now these links are calculated relative to the top note, which ensures they point to the right path.
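Recomputing such a link can be sketched with plain `std::path` operations. This is a simplified illustration that assumes both paths are relative to the vault root; the crate itself relies on the pathdiff crate for the general case:

```rust
use std::path::{Component, Path, PathBuf};

// Resolve a link found inside an embedded note against the location of that
// note, yielding a vault-root-relative target usable from the top note.
fn resolve_embedded_link(embedded_note: &Path, link: &str) -> PathBuf {
    let base = embedded_note.parent().unwrap_or_else(|| Path::new(""));
    let mut resolved = PathBuf::new();
    for component in base.join(link).components() {
        match component {
            // ".." pops the last component instead of being kept literally.
            Component::ParentDir => {
                resolved.pop();
            }
            // "." contributes nothing.
            Component::CurDir => {}
            other => resolved.push(other.as_os_str()),
        }
    }
    resolved
}
```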
### Other

* Add brief library documentation to all public types and functions. [Nick Groenen]

## v0.3.0 (2020-12-21)

### New

* Report file tree when RecursionLimitExceeded is hit. [Nick Groenen]

  This refactors the Context to maintain a list of all the files which have been processed so far in a chain of embeds. This information is then used to print a more helpful error message to users of the CLI when RecursionLimitExceeded is returned.

### Changes

* Add extra whitespace around multi-line warnings. [Nick Groenen]

  This makes errors a bit easier to distinguish after a number of warnings have been printed.

### Other

* Setup gitchangelog. [Nick Groenen]

  This adds a changelog (CHANGES.md) which is automatically generated with [gitchangelog].

  [gitchangelog]: https://github.com/vaab/gitchangelog
## v0.2.0 (2020-12-13)

* Allow custom filter function to be passed with WalkOptions. [Nick Groenen]

* Re-export vault_contents and WalkOptions as pub from crate root. [Nick Groenen]

* Run mdbook hook against README.md too. [Nick Groenen]

* Update installation instructions. [Nick Groenen]

  Installation no longer requires a git repository URL now that a crate is published.

* Add MdBook generation script and precommit hook. [Nick Groenen]

* Add more reliable non-ASCII testcase. [Nick Groenen]

* Create FUNDING.yml. [Nick Groenen]

## v0.1.0 (2020-11-28)

* Public release. [Nick Groenen]
@@ -0,0 +1,75 @@
# Contributing to Obsidian Export

Hi there!
Thank you so much for wanting to contribute to this project.
I greatly appreciate any efforts people like you put into making obsidian-export better!

However, managing an open-source project can take a lot of time and effort.
As this is a passion project which I maintain alongside my regular daytime job, I need to take some measures to safeguard my mental health and the enjoyment of this project.

This document aims to make contributions easier by:

1. Defining the expectations I have of submissions to the codebase and the pull request process.
2. Helping you get set up for development on the code.
3. Providing pointers to some areas of the codebase, as well as some design considerations to take into account when making changes.

## Working with Rust

Obsidian-export is written in [Rust](https://www.rust-lang.org/), which is not the easiest of languages to master.
If you'd like to contribute but you don't know Rust, check out [Learn Rust](https://www.rust-lang.org/learn) for some suggestions on how to get started with the language.
In general, I will do my best to support you and help you out, but understand that my time for mentoring is highly limited.

To work on the codebase, you'll also need the Rust toolchain, including cargo, rustfmt and clippy.
The easiest way is to [install Rust using rustup](https://www.rust-lang.org/tools/install), which lets you install rustfmt and clippy using `rustup component add rustfmt` and `rustup component add clippy` respectively.

## Design principles

My intention is to keep the core of `obsidian-export` as limited and small as possible, avoiding changes to the core [`Exporter`](https://docs.rs/obsidian-export/latest/obsidian_export/struct.Exporter.html) struct or any of its methods whenever possible.
This improves long-term maintainability and makes investigation of bugs simpler.

To keep the core of obsidian-export small while still supporting a wide range of use-cases, additional functionality should be pushed down into [postprocessors](https://docs.rs/obsidian-export/latest/obsidian_export/type.Postprocessor.html) as much as possible.
You can see some examples of this in:

- [Support Obsidian's "Strict line breaks" setting (#57)](https://github.com/zoni/obsidian-export/pull/57)
- [Frontmatter based filtering (#67)](https://github.com/zoni/obsidian-export/pull/67)

## Conventions

Code is formatted with [rustfmt](https://github.com/rust-lang/rustfmt) using the default options.
In addition, all default [clippy](https://github.com/rust-lang/rust-clippy) checks on the latest stable Rust compiler must also pass.
Both of these are enforced through CI using GitHub Actions.

> **💡 Tip: install pre-commit hooks**
>
> This codebase is set up with the [pre-commit framework](https://pre-commit.com/) to automatically run the appropriate checks locally whenever you commit.
> Assuming you [have pre-commit installed](https://pre-commit.com/#install), all you need to do is run `pre-commit install` once to get this set up.

Following my advice on [creating high-quality commits](https://nick.groenen.me/notes/high-quality-commits/) will make it easier for me to review changes.
I don't insist on this, but pull requests which fail to adhere to these conventions are at risk of being squashed and having their commit messages rewritten when they are accepted.

## Tests

In order to have confidence that your changes work as intended, as well as to avoid regressions when making changes in the future, I would like to see code accompanied by test cases.

At the moment, the test framework primarily relies on high-level integration tests, all of which are defined in the [tests](tests/) directory.
These rely on comparing Markdown notes [before](tests/testdata/input) and [after](tests/testdata/expected) running an export.
By studying some of the existing tests, you should be able to copy and adapt these for your own changes.

For an example of doing low-level unit tests, you can look at the end of [frontmatter.rs](src/frontmatter.rs).
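The before/after comparison style described above can be sketched as follows. This is a simplified, self-contained illustration of the pattern, not an actual test from the repository; the real tests run the full exporter against the files under tests/testdata:

```rust
use std::fs;

// `convert` is a hypothetical stand-in for running an actual export;
// here it merely normalizes Windows line endings.
fn convert(input: &str) -> String {
    input.replace("\r\n", "\n")
}

// Compare the converted input file against a checked-in expected file,
// mirroring how the integration tests compare input/ and expected/ notes.
fn assert_matches_expected(input_path: &str, expected_path: &str) {
    let input = fs::read_to_string(input_path).expect("failed to read input");
    let expected = fs::read_to_string(expected_path).expect("failed to read expected output");
    assert_eq!(convert(&input), expected);
}
```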
## Documentation

I place a lot of value on good documentation and would encourage you to include updates to the docs with your changes.
Changes or additions to public methods and attributes **must** come with proper documentation for a PR to be accepted.

Advice on writing Rust documentation can be found in:

- [The rustdoc book: How to write documentation](https://doc.rust-lang.org/rustdoc/how-to-write-documentation.html)
- [Rust by example: Documentation](https://doc.rust-lang.org/rust-by-example/meta/doc.html)

Updates to the user guide/README instructions are also preferred, but optional.
If you don't feel comfortable writing user documentation, I will be happy to guide you or do it for you.

> **⚠ Warning**
>
> If you update the README file, take note that you must edit the fragments in the [docs](docs/) directory as opposed to the README in the root of the repository, which is auto-generated.
(File diff suppressed because it is too large.)
Cargo.toml (67 changed lines)
@@ -1,9 +1,9 @@
 [package]
 name = "obsidian-export"
-version = "0.7.0"
+version = "23.12.0"
 authors = ["Nick Groenen <nick@groenen.me>"]
 edition = "2018"
-license = "MIT OR Apache-2.0"
+license = "BSD-2-Clause-Patent"
 readme = "README.md"
 repository = "https://github.com/zoni/obsidian-export"
 documentation = "https://docs.rs/obsidian-export"
@@ -20,26 +20,59 @@ path = "src/lib.rs"
 [[bin]]
 name = "obsidian-export"
 path = "src/main.rs"
 doc = false

 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

 [dependencies]
-eyre = "0.6.5"
-gumdrop = "0.8.0"
-ignore = "0.4.17"
+eyre = "0.6.9"
+gumdrop = "0.8.1"
+ignore = "0.4.21"
 lazy_static = "1.4.0"
 matter = "0.1.0-alpha4"
-pathdiff = "0.2.0"
-percent-encoding = "2.1.0"
-pulldown-cmark = "0.8.0"
-pulldown-cmark-to-cmark = "6.0.0"
-rayon = "1.5.0"
-regex = "1.4.5"
-serde_yaml = "0.8.17"
-slug = "0.1.4"
-snafu = "0.6.10"
+pathdiff = "0.2.1"
+percent-encoding = "2.3.1"
+pulldown-cmark = "0.9.3"
+pulldown-cmark-to-cmark = "11.0.2"
+rayon = "1.8.0"
+regex = "1.10.2"
+serde_yaml = "0.9.27"
+slug = "0.1.5"
+snafu = "0.7.5"
 unicode-normalization = "0.1.22"

 [dev-dependencies]
-pretty_assertions = "0.7.1"
-tempfile = "3.2.0"
-walkdir = "2.3.2"
+pretty_assertions = "1.4.0"
+rstest = "0.18.2"
+tempfile = "3.8.1"
+walkdir = "2.4.0"
+
+# The profile that 'cargo dist' will build with
+[profile.dist]
+inherits = "release"
+lto = "thin"
+
+# Config for 'cargo dist'
+[workspace.metadata.dist]
+# The preferred cargo-dist version to use in CI (Cargo.toml SemVer syntax)
+cargo-dist-version = "0.4.3"
+# CI backends to support
+ci = ["github"]
+# The installers to generate for each app
+installers = ["shell", "powershell"]
+# Target platforms to build apps for (Rust target-triple syntax)
+targets = [
+    #"aarch64-unknown-linux-gnu", # Not yet supported (2023-12-03)
+    "x86_64-unknown-linux-gnu",
+    #"x86_64-unknown-linux-musl",
+    "aarch64-apple-darwin",
+    "x86_64-apple-darwin",
+    # "aarch64-pc-windows-msvc", # Not yet supported (2023-12-03)
+    "x86_64-pc-windows-msvc",
+]
+unix-archive = ".tar.xz"
+windows-archive = ".zip"
+# Publish jobs to run in CI
+pr-run-mode = "plan"
+# Publish jobs to run in CI
+publish-jobs = ["./publish-crate"]
@@ -0,0 +1,45 @@
Copyright (c) Nick Groenen <nick@groenen.me>

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

Subject to the terms and conditions of this license, each copyright holder and contributor hereby grants to those receiving rights under this license a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except for failure to satisfy the conditions of this license) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer this software, where such license applies only to those patent claims, already acquired or hereafter acquired, licensable by such copyright holder or contributor that are necessarily infringed by:

(a) their Contribution(s) (the licensed copyrights of copyright holders and non-copyrightable additions of contributors, in source or binary form) alone; or
(b) combination of their Contribution(s) with the work of authorship to which such Contribution(s) was added by such copyright holder or contributor, if, at the time the Contribution is added, such addition causes such combination to be necessarily infringed. The patent license shall not apply to any other combinations which include the Contribution.

Except as expressly stated above, no rights or licenses from any copyright holder or contributor is granted under this license, whether expressly, by implication, estoppel or otherwise.

DISCLAIMER

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
LICENSE-APACHE (201 deleted lines)
@@ -1,201 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:

(a) You must give any other recipients of the Work or Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "[]"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright [yyyy] [name of copyright owner]
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
LICENSE-MIT (25 lines removed)

@@ -1,25 +0,0 @@
Copyright (c) 2020 Nick Groenen

Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
README.md (374 lines changed)

@@ -6,20 +6,19 @@ WARNING:
the docs directory.

Instead of editing README.md, edit the corresponding Markdown files in the
docs directory and run generate.sh.

To add new sections, create new files under docs and add these to _combined.md

-->

# Obsidian Export

*Obsidian Export is a CLI program and a Rust library to export an [Obsidian](https://obsidian.md/) vault to regular Markdown.*
*Obsidian Export is a CLI program and a Rust library to export an [Obsidian] vault to regular Markdown.*

* Recursively export Obsidian Markdown files to [CommonMark](https://commonmark.org/).
* Recursively export Obsidian Markdown files to [CommonMark].
* Supports `[[note]]`-style references as well as `![[note]]` file includes.
* Support for [gitignore](https://git-scm.com/docs/gitignore)-style exclude patterns (default: `.export-ignore`).
* Support for [gitignore]-style exclude patterns (default: `.export-ignore`).
* Automatically excludes files that are ignored by Git when the vault is located in a Git repository.
* Runs on all major platforms: Windows, Mac, Linux, BSDs.

@@ -31,21 +30,20 @@ It supports most but not all of Obsidian's Markdown flavor.

## Pre-built binaries

Binary releases for x86-64 processors are provided for Windows, Linux and Mac operating systems on a best-effort basis.
They are built with GitHub runners as part of the release workflow defined in `.github/workflows/release.yml`.
Pre-compiled binaries for all major platforms are available at <https://github.com/zoni/obsidian-export/releases>

The resulting binaries can be downloaded from [https://github.com/zoni/obsidian-export/releases](https://github.com/zoni/obsidian-export/releases)
In addition to the installation scripts provided, these releases are also suitable for [installation with cargo-binstall](https://github.com/cargo-bins/cargo-binstall#readme).

## Building from source

When binary releases are unavailable for your platform, or you do not trust the pre-built binaries, then *obsidian-export* can be compiled from source with relatively little effort.
This is done through [Cargo](https://doc.rust-lang.org/cargo/), the official package manager for Rust, with the following steps:
This is done through [Cargo], the official package manager for Rust, with the following steps:

1. Install the Rust toolchain from [https://www.rust-lang.org/tools/install](https://www.rust-lang.org/tools/install)
1. Install the Rust toolchain from <https://www.rust-lang.org/tools/install>
1. Run: `cargo install obsidian-export`

>
> It is expected that you successfully configured the PATH variable while installing the Rust toolchain, as described under *"Configuring the PATH environment variable"* on [https://www.rust-lang.org/tools/install](https://www.rust-lang.org/tools/install).
> It is expected that you successfully configured the PATH variable while installing the Rust toolchain, as described under *"Configuring the PATH environment variable"* on <https://www.rust-lang.org/tools/install>.

## Upgrading from earlier versions

@@ -59,7 +57,7 @@ If you built from source, upgrade by running `cargo install obsidian-export` aga
The main interface of *obsidian-export* is the `obsidian-export` CLI command.
As a text interface, this must be run from a terminal or Windows PowerShell.

It is assumed that you have basic familiarity with command-line interfaces and that you set up your `PATH` correctly if you installed with `cargo`.
Running `obsidian-export --version` should print a version number rather than giving some kind of error.

>

@@ -67,6 +65,8 @@ Running `obsidian-export --version` should print a version number rather than gi
>
> For example `~/Downloads/obsidian-export --version` on Mac/Linux or `~\Downloads\obsidian-export --version` on Windows (PowerShell).

## Exporting notes

In its most basic form, `obsidian-export` takes just two mandatory arguments, a source and a destination:

````sh
@@ -89,6 +89,31 @@ obsidian-export my-obsidian-vault/some-note.md /tmp/export/
obsidian-export my-obsidian-vault/some-note.md /tmp/exported-note.md
````

Note that in this mode, obsidian-export sees `some-note.md` as being the only file that exists in your vault, so references to other notes won't be resolved.
This is by design.

If you'd like to export a single note while resolving links or embeds to other areas in your vault, then you should instead specify the root of your vault as the source, passing the file you'd like to export with `--start-at`, as described in the next section.

### Exporting a partial vault

Using the `--start-at` argument, you can export just a subset of your vault.
Given the following vault structure:

````
my-obsidian-vault
├── Notes/
├── Books/
└── People/
````

This will export only the notes in the `Books` directory to `exported-notes`:

````sh
obsidian-export my-obsidian-vault --start-at my-obsidian-vault/Books exported-notes
````

In this mode, all notes under the source (the first argument) are considered part of the vault, so any references to these files will remain intact, even if they're not part of the exported notes.

## Character encodings

At present, UTF-8 character encoding is assumed for all note text as well as filenames.

@@ -112,9 +137,14 @@ To completely remove any frontmatter from exported notes, use `--frontmatter=nev

## Ignoring files

By default, hidden files, patterns listed in `.export-ignore` as well as any files ignored by git (if your vault is part of a git repository) will be excluded from exports.
The following files are not exported by default:

* hidden files (can be adjusted with `--hidden`)
* files matching a pattern listed in `.export-ignore` (can be adjusted with `--ignore-file`)
* any files that are ignored by git (can be adjusted with `--no-git`)
* using `--skip-tags foo --skip-tags bar` will skip any files that have the tags `foo` or `bar` in their frontmatter
* using `--only-tags foo --only-tags bar` will skip any files that **don't** have the tags `foo` or `bar` in their frontmatter

These options may be adjusted with `--hidden`, `--ignore-file` and `--no-git` if desired.
(See `--help` for more information).
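The tag-based filtering described above can be sketched as follows. This is an illustrative Python model of the documented behavior, not the tool's actual Rust implementation; the function name `should_export` is hypothetical.

```python
# Hypothetical sketch of --skip-tags / --only-tags frontmatter filtering.
def should_export(note_tags, skip_tags=(), only_tags=()):
    """Return True if a note with the given frontmatter tags is exported."""
    tags = set(note_tags)
    if skip_tags and tags & set(skip_tags):
        return False  # note carries a tag listed in --skip-tags
    if only_tags and not tags & set(only_tags):
        return False  # --only-tags given, but no matching tag present
    return True

print(should_export(["foo", "baz"], skip_tags=["foo"]))  # False
print(should_export(["bar"], only_tags=["foo", "bar"]))  # True
```

Both filters compose: a note must pass the skip check and, when `--only-tags` is given, carry at least one of the listed tags.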
Notes linking to ignored notes will be unlinked (they'll only include the link text).

@@ -122,7 +152,7 @@ Embeds of ignored notes will be skipped entirely.

### Ignorefile syntax

The syntax for `.export-ignore` files is identical to that of [gitignore](https://git-scm.com/docs/gitignore) files.
The syntax for `.export-ignore` files is identical to that of [gitignore] files.
Here's an example:

````
@@ -136,7 +166,7 @@ test
!special.pdf
````

For more comprehensive documentation and examples, see the [gitignore](https://git-scm.com/docs/gitignore) manpage.
For more comprehensive documentation and examples, see the [gitignore] manpage.

## Recursive embeds

@@ -150,12 +180,12 @@ Using this mode, if a note is encountered for a second time while processing the

## Relative links with Hugo

The [Hugo](https://gohugo.io) static site generator [does not support relative links to files](https://notes.nick.groenen.me/notes/relative-linking-in-hugo/).
Instead, it expects you to link to other pages using the [`ref` and `relref` shortcodes](https://gohugo.io/content-management/cross-references/).
The [Hugo] static site generator [does not support relative links to files](https://notes.nick.groenen.me/notes/relative-linking-in-hugo/).
Instead, it expects you to link to other pages using the [`ref` and `relref` shortcodes].

As a result of this, notes that have been exported from Obsidian using obsidian-export do not work out of the box because Hugo doesn't resolve these links correctly.

[Markdown Render Hooks](https://gohugo.io/getting-started/configuration-markup#markdown-render-hooks) (only supported using the default `goldmark` renderer) allow you to work around this issue, however, making exported notes work with Hugo after a bit of one-time setup work.
[Markdown Render Hooks] (only supported using the default `goldmark` renderer) allow you to work around this issue, however, making exported notes work with Hugo after a bit of one-time setup work.

Create the file `layouts/_default/_markup/render-link.html` with the following contents:

@@ -217,298 +247,30 @@ All of the functionality exposed by the `obsidian-export` CLI command is also ac
To get started, visit the library documentation on [obsidian_export](https://docs.rs/obsidian-export/latest/obsidian_export/) and [obsidian_export::Exporter](https://docs.rs/obsidian-export/latest/obsidian_export/struct.Exporter.html).

# Contributing

I will happily accept bug fixes as well as enhancements, as long as they align with the overall scope and vision of the project.
Please see [CONTRIBUTING](CONTRIBUTING.md) for more information.

# License

Obsidian-export is dual-licensed under the [Apache 2.0](https://github.com/zoni/obsidian-export/blob/master/LICENSE-APACHE) and the [MIT](https://github.com/zoni/obsidian-export/blob/master/LICENSE-MIT) licenses.
Obsidian-export is open-source software released under the [BSD-2-Clause Plus Patent License].
This license is designed to provide: a) a simple permissive license; b) that is compatible with the GNU General Public License (GPL), version 2; and c) which also has an express patent grant included.

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in this project by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
Please review the [LICENSE] file for the full text of the license.

# Changelog

## v0.7.0 (2021-04-11)
For a list of releases and the changes with each version, please refer to the [CHANGELOG](CHANGELOG.md).

### New

* Postprocessing support. \[Nick Groenen]

  Add support for postprocessing of Markdown prior to writing converted
  notes to disk.

  Postprocessors may be used when making use of Obsidian export as a Rust
  library to do the following:

  1. Modify a note's `Context`, for example to change the destination
     filename or update its Frontmatter.
  1. Change a note's contents by altering `MarkdownEvents`.
  1. Prevent later postprocessors from running or cause a note to be
     skipped entirely.

  Future releases of Obsidian export may come with built-in postprocessors
  for users of the command-line tool to use, if general use-cases can be
  identified.

  For example, a future release might include functionality to make notes
  more suitable for the Hugo static site generator. This functionality
  would be implemented as a postprocessor that could be enabled through
  command-line flags.
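The postprocessor chain described in this changelog entry can be sketched as follows. This is a hedged Python model of the three capabilities listed (modify context, alter events, halt or skip), not the crate's actual Rust API; the names `Flow` and `run_postprocessors` are hypothetical.

```python
from enum import Enum

class Flow(Enum):
    CONTINUE = 1   # run the next postprocessor
    STOP = 2       # run no further postprocessors for this note
    SKIP_NOTE = 3  # drop the note from the export entirely

def run_postprocessors(context, events, postprocessors):
    """Run each postprocessor in order; each may mutate context/events."""
    for pp in postprocessors:
        flow = pp(context, events)
        if flow is Flow.SKIP_NOTE:
            return None
        if flow is Flow.STOP:
            break
    return context, events

# Example postprocessor: rewrite a frontmatter field on the context.
def uppercase_title(context, events):
    context["frontmatter"]["title"] = context["frontmatter"]["title"].upper()
    return Flow.CONTINUE

ctx = {"frontmatter": {"title": "my note"}}
result = run_postprocessors(ctx, [], [uppercase_title])
print(result[0]["frontmatter"]["title"])  # MY NOTE
```

Returning a "stop" or "skip" signal from a callback, rather than raising, keeps control flow explicit for library users composing several postprocessors.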
### Fixes

* Also percent-encode `?` in filenames. \[Nick Groenen]

  A recent Obsidian update expanded the list of allowed characters in
  filenames, which now includes `?` as well. This needs to be
  percent-encoded for proper links in static site generators like Hugo.
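The effect of this fix can be illustrated with Python's standard library; the example filename is invented for demonstration.

```python
from urllib.parse import quote

# A link target containing '?' must be percent-encoded as %3F, otherwise
# tools like Hugo would treat it as the start of a query string.
encoded = quote("What is Rust?.md")
print(encoded)  # What%20is%20Rust%3F.md
```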
### Other

* Bump pretty_assertions from 0.6.1 to 0.7.1. \[dependabot\[bot]]

  Bumps [pretty_assertions](https://github.com/colin-kiegel/rust-pretty-assertions) from 0.6.1 to 0.7.1.

  * [Release notes](https://github.com/colin-kiegel/rust-pretty-assertions/releases)
  * [Changelog](https://github.com/colin-kiegel/rust-pretty-assertions/blob/main/CHANGELOG.md)
  * [Commits](https://github.com/colin-kiegel/rust-pretty-assertions/compare/v0.6.1...v0.7.1)

* Bump walkdir from 2.3.1 to 2.3.2. \[dependabot\[bot]]

  Bumps [walkdir](https://github.com/BurntSushi/walkdir) from 2.3.1 to 2.3.2.

  * [Release notes](https://github.com/BurntSushi/walkdir/releases)
  * [Commits](https://github.com/BurntSushi/walkdir/compare/2.3.1...2.3.2)

* Bump regex from 1.4.3 to 1.4.5. \[dependabot\[bot]]

  Bumps [regex](https://github.com/rust-lang/regex) from 1.4.3 to 1.4.5.

  * [Release notes](https://github.com/rust-lang/regex/releases)
  * [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
  * [Commits](https://github.com/rust-lang/regex/compare/1.4.3...1.4.5)

## v0.6.0 (2021-02-15)

### New

* Add `--version` flag. \[Nick Groenen]

### Changes

* Don't Box FilterFn in WalkOptions. \[Nick Groenen]

  Previously, `filter_fn` on the `WalkOptions` struct looked like:

  ````
  pub filter_fn: Option<Box<&'static FilterFn>>,
  ````

  This boxing was unnecessary and has been changed to:

  ````
  pub filter_fn: Option<&'static FilterFn>,
  ````

  This will only affect people who use obsidian-export as a library in
  other Rust programs, not users of the CLI.

  For those library users, they no longer need to supply `FilterFn`
  wrapped in a Box.

### Fixes

* Recognize notes beginning with underscores. \[Nick Groenen]

  Notes with an underscore would fail to be recognized within Obsidian
  `[[_WikiLinks]]` due to the assumption that the underlying Markdown
  parser (pulldown_cmark) would emit the text between `[[` and `]]` as
  a single event.

  The note parser has now been rewritten to use a more reliable state
  machine which correctly recognizes this corner-case (and likely some
  others).

* Support self-references. \[Joshua Coles]

  This ensures links to headings within the same note (`[[#Heading]]`)
  resolve correctly.

### Other

* Avoid redundant "Release" in GitHub release titles. \[Nick Groenen]

* Add failing testcase for files with underscores. \[Nick Groenen]

* Add unit tests for display of ObsidianNoteReference. \[Nick Groenen]

* Add some unit tests for ObsidianNoteReference::from_str. \[Nick Groenen]

* Also run tests on pull requests. \[Nick Groenen]

* Apply clippy suggestions following rust 1.50.0. \[Nick Groenen]

* Fix infinite recursion bug with references to current file. \[Joshua Coles]

* Add tests for self-references. \[Joshua Coles]

  Note that, as there is no support for block references at the moment, the generated link goes nowhere; however, it points to a reasonable ID.

* Bump tempfile from 3.1.0 to 3.2.0. \[dependabot\[bot]]

  Bumps [tempfile](https://github.com/Stebalien/tempfile) from 3.1.0 to 3.2.0.

  * [Release notes](https://github.com/Stebalien/tempfile/releases)
  * [Changelog](https://github.com/Stebalien/tempfile/blob/master/NEWS)
  * [Commits](https://github.com/Stebalien/tempfile/commits)

* Bump eyre from 0.6.3 to 0.6.5. \[dependabot\[bot]]

  Bumps [eyre](https://github.com/yaahc/eyre) from 0.6.3 to 0.6.5.

  * [Release notes](https://github.com/yaahc/eyre/releases)
  * [Changelog](https://github.com/yaahc/eyre/blob/v0.6.5/CHANGELOG.md)
  * [Commits](https://github.com/yaahc/eyre/compare/v0.6.3...v0.6.5)

* Bump regex from 1.4.2 to 1.4.3. \[dependabot\[bot]]

  Bumps [regex](https://github.com/rust-lang/regex) from 1.4.2 to 1.4.3.

  * [Release notes](https://github.com/rust-lang/regex/releases)
  * [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
  * [Commits](https://github.com/rust-lang/regex/compare/1.4.2...1.4.3)

## v0.5.1 (2021-01-10)

### Fixes

* Find uppercased notes when referenced with lowercase. \[Nick Groenen]

  This commit fixes a bug where, if a note contained uppercase characters
  (for example `Note.md`) but was referred to using lowercase
  (`[[note]]`), that note would not be found.
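One common way to get the case-insensitive lookup this fix describes is to index filenames by their lowercased form. This is a hypothetical sketch of that idea, not the actual Rust implementation.

```python
# Index vault files by lowercased name so `[[note]]` still finds "Note.md".
vault_files = ["Note.md", "Books/Dune.md"]
index = {path.lower(): path for path in vault_files}

def resolve(reference):
    """Resolve a wikilink reference to a vault file, ignoring case."""
    return index.get(f"{reference}.md".lower())

print(resolve("note"))  # Note.md
```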
## v0.5.0 (2021-01-05)

### New

* Add --no-recursive-embeds to break infinite recursion cycles. \[Nick Groenen]

  It's possible to end up with "recursive embeds" when two notes embed
  each other. This happens for example when a `Note A.md` contains
  `![[Note B]]` but `Note B.md` also contains `![[Note A]]`.

  By default, this will trigger an error and display the chain of notes
  which caused the recursion.

  Using the new `--no-recursive-embeds`, if a note is encountered for a
  second time while processing the original note, rather than embedding it
  again a link to the note is inserted instead to break the cycle.

  See also: https://github.com/zoni/obsidian-export/issues/1
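The cycle-breaking behavior described above can be modeled in a few lines. This is an assumed simplification in Python (not the actual implementation): track the chain of notes being embedded and, on revisiting one, emit a link instead of recursing.

```python
import re

# Two notes that embed each other, as in the changelog example.
notes = {
    "Note A": "intro ![[Note B]] outro",
    "Note B": "details ![[Note A]]",
}

def render(name, chain=()):
    if name in chain:
        return f"[[{name}]]"  # cycle: insert a link instead of embedding
    def expand(match):
        return render(match.group(1), chain + (name,))
    return re.sub(r"!\[\[(.+?)\]\]", expand, notes[name])

print(render("Note A"))  # intro details [[Note A]] outro
```

Passing the chain down the recursion (rather than keeping one global visited set) means a note may still be embedded twice in unrelated branches; only true cycles are broken.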
* Make walk options configurable on CLI. \[Nick Groenen]

  By default hidden files, patterns listed in `.export-ignore` as well as
  any files ignored by git are excluded from exports. This behavior has
  been made configurable on the CLI using the new flags `--hidden`,
  `--ignore-file` and `--no-git`.

* Support links referencing headings. \[Nick Groenen]

  Previously, links referencing a heading (`[[note#heading]]`) would just
  link to the file name without including an anchor in the link target.
  Now, such references will include an appropriate `#anchor` attribute.

  Note that neither the original Markdown specification, nor the more
  recent CommonMark standard, specify how anchors should be constructed
  for a given heading.

  There are also some differences between the various Markdown rendering
  implementations.

  Obsidian-export uses the [slug](https://crates.io/crates/slug) crate to generate anchors which should
  be compatible with most implementations, however your mileage may vary.

  (For example, GitHub may leave a trailing `-` on anchors when headings
  end with a smiley. The slug library, and thus obsidian-export, will
  avoid such dangling dashes).
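Slug-style anchor generation can be approximated as follows. This is a rough Python sketch of the general technique, assuming slug rules (ASCII-fold, replace non-alphanumeric runs with `-`, trim dashes, lowercase); the Rust `slug` crate's exact output may differ in edge cases.

```python
import re
import unicodedata

def anchor(heading):
    """Approximate a slug-style anchor for a Markdown heading."""
    # Fold accented characters to ASCII, drop anything unrepresentable.
    text = unicodedata.normalize("NFKD", heading).encode("ascii", "ignore").decode()
    # Collapse runs of non-alphanumerics into single dashes, trim the ends
    # (this trimming is what avoids the dangling '-' mentioned above).
    text = re.sub(r"[^a-zA-Z0-9]+", "-", text).strip("-")
    return text.lower()

print(anchor("My Heading!"))      # my-heading
print(anchor("Änderungen 2021"))  # anderungen-2021
```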
* Support embeds referencing headings. \[Nick Groenen]

  Previously, partial embeds (`![[note#heading]]`) would always include
  the entire file into the source note. Now, such embeds will only include
  the contents of the referenced heading (and any subheadings).

  Links and embeds of [arbitrary blocks](https://publish.obsidian.md/help/How+to/Link+to+blocks) remain unsupported at this time.

### Changes

* Print warnings to stderr rather than stdout. \[Nick Groenen]

  Warning messages emitted when encountering broken links/references will
  now be printed to stderr as opposed to stdout.

### Other

* Include filter_fn field in WalkOptions debug display. \[Nick Groenen]

## v0.4.0 (2020-12-23)

### Fixes

* Correct relative links within embedded notes. \[Nick Groenen]

  Links within an embedded note would point to other local resources
  relative to the filesystem location of the note being embedded.

  When a note inside a different directory would embed such a note, these
  links would point to invalid locations.

  Now these links are calculated relative to the top note, which ensures
  these links will point to the right path.

### Other

* Add brief library documentation to all public types and functions. \[Nick Groenen]

## v0.3.0 (2020-12-21)

### New

* Report file tree when RecursionLimitExceeded is hit. \[Nick Groenen]

  This refactors the Context to maintain a list of all the files which
  have been processed so far in a chain of embeds. This information is
  then used to print a more helpful error message to users of the CLI when
  RecursionLimitExceeded is returned.

### Changes

* Add extra whitespace around multi-line warnings. \[Nick Groenen]

  This makes errors a bit easier to distinguish after a number of warnings
  have been printed.

### Other

* Setup gitchangelog. \[Nick Groenen]

  This adds a changelog (CHANGES.md) which is automatically generated with
  [gitchangelog](https://github.com/vaab/gitchangelog).

## v0.2.0 (2020-12-13)

* Allow custom filter function to be passed with WalkOptions. \[Nick Groenen]

* Re-export vault_contents and WalkOptions as pub from crate root. \[Nick Groenen]

* Run mdbook hook against README.md too. \[Nick Groenen]

* Update installation instructions. \[Nick Groenen]

  Installation no longer requires a git repository URL now that a crate is
  published.

* Add MdBook generation script and precommit hook. \[Nick Groenen]

* Add more reliable non-ASCII testcase. \[Nick Groenen]

* Create FUNDING.yml. \[Nick Groenen]

## v0.1.0 (2020-11-28)

* Public release. \[Nick Groenen]
[Obsidian]: https://obsidian.md/
[CommonMark]: https://commonmark.org/
[gitignore]: https://git-scm.com/docs/gitignore
[Cargo]: https://doc.rust-lang.org/cargo/
[Hugo]: https://gohugo.io
[`ref` and `relref` shortcodes]: https://gohugo.io/content-management/cross-references/
[Markdown Render Hooks]: https://gohugo.io/getting-started/configuration-markup#markdown-render-hooks
[BSD-2-Clause Plus Patent License]: https://spdx.org/licenses/BSD-2-Clause-Patent.html
[LICENSE]: LICENSE
|
|
|
@ -0,0 +1,70 @@
# https://git-cliff.org/docs/configuration

[changelog]
# changelog header
header = """
# Changelog\n
"""
# template for the changelog body
# https://keats.github.io/tera/docs/#introduction
body = """
{% if version -%}\
## {{ version }} ({{ timestamp | date(format="%Y-%m-%d") }})
{% else -%}\
## [unreleased]
{% endif %}\
{% set grouped_commits = commits | group_by(attribute="group") %}\
{% set_global groups = [] %}\
{% for group, commits in grouped_commits -%}\
{% set_global groups = groups | concat(with=group) %}\
{% endfor -%}\
{% for group in groups | sort %}
### {{ group | split(pat=" ") | slice(start=1) | join(sep=" ") | upper_first }}
{% for commit in grouped_commits[group] -%}
{% for line in commit.message | split(pat="\\n") -%}\
{% if loop.first %}
- {{ line | upper_first }} [{{ commit.author.name }}]{% else %} {{ line | trim }} {% endif %}
{% endfor -%}
{% endfor -%}
{% endfor %}\n
"""
trim = false
footer = """
<!-- generated by git-cliff -->
"""
postprocessors = [
  { pattern = "- ([Nn]ew|[Ff]ix|[Cc]hg):\\s*(.*)", replace = "- $2" },
]

[git]
# parse the commits based on https://www.conventionalcommits.org
conventional_commits = false
# filter out the commits that are not conventional
filter_unconventional = false
# process each line of a commit as an individual commit
split_commits = false
# regex for preprocessing the commit messages
commit_preprocessors = [
  # { pattern = '\((\w+\s)?#([0-9]+)\)', replace = "([#${2}](<REPO>/issues/${2}))"}, # replace issue numbers
]
# regex for parsing and grouping commits
commit_parsers = [
  { body = "!skip_changelog", skip = true },
  { message = "^[Nn]ew:\\s*(.*)", group = "01. New" },
  { message = "^[Ff]ix:\\s*(.*)", group = "02. Fixes" },
  { message = "^[Cc]hg:\\s*(.*)", group = "03. Changes" },
  { field = "author.name", pattern = "(dependabot|renovate)\\[bot\\]", skip = true },
  { message = ".*", group = "10. Other" },
]

# protect breaking changes from being skipped due to matching a skipping commit_parser
protect_breaking_commits = false
# filter out the commits that are not matched by commit parsers
filter_commits = false
# regex for matching git tags
tag_pattern = "v[0-9].*"

# sort the tags topologically
topo_order = false
# sort the commits inside sections by oldest/newest order
sort_commits = "oldest"
@ -1 +1 @@
{"theme":"moonstone","pluginEnabledStatus":{"Open in default app":true,"file-explorer":true,"global-search":true,"switcher":true,"graph":true,"backlink":true,"command-palette":true,"markdown-importer":false,"word-count":true,"tag-pane":false,"daily-notes":false,"slides":false,"open-with-default-app":true,"random-note":false,"page-preview":true,"zk-prefixer":false,"starred":false,"outline":true,"templates":false,"workspaces":false},"isSidebarCollapsed":false,"isRightSidedockCollapsed":true,"lastOpenSidebarTab":"File explorer","useTab":false,"showLineNumber":true,"foldHeading":true,"foldIndent":true,"vimMode":true,"newFileLocation":"current","hotkeys":{"switcher:open":[{"modifiers":["Mod"],"key":" "}],"app:go-back":[{"modifiers":["Mod"],"key":"o"}],"app:go-forward":[{"modifiers":["Mod"],"key":"i"}],"editor:toggle-bold":[{"modifiers":["Mod"],"key":"8"}],"editor:toggle-italics":[{"modifiers":["Mod"],"key":"-"}],"editor:toggle-highlight":[{"modifiers":["Mod"],"key":"="}],"editor:delete-paragraph":[],"editor:focus-top":[{"modifiers":["Alt"],"key":"k"}],"editor:focus-bottom":[{"modifiers":["Alt"],"key":"j"}],"editor:focus-left":[{"modifiers":["Alt"],"key":"h"}],"editor:focus-right":[{"modifiers":["Alt"],"key":"l"}],"workspace:split-horizontal":[{"modifiers":["Alt"],"key":"s"}],"workspace:split-vertical":[{"modifiers":["Alt"],"key":"v"}],"workspace:toggle-pin":[{"modifiers":["Alt"],"key":"t"}],"graph:open":[],"backlink:open-backlinks":[{"modifiers":["Alt"],"key":"b"}],"workspace:close":[{"modifiers":["Alt"],"key":"w"}],"editor:swap-line-up":[{"modifiers":["Mod","Shift"],"key":"K"}],"editor:swap-line-down":[{"modifiers":["Mod","Shift"],"key":"J"}],"outline:open":[{"modifiers":["Alt"],"key":"o"}],"app:toggle-left-sidebar":[{"modifiers":["Alt"],"key":","}],"app:toggle-right-sidebar":[{"modifiers":["Alt"],"key":"."}],"graph:open-local":[{"modifiers":["Alt"],"key":"g"}]},"lastOpenRightSidedockTab":"Backlinks","sideDockWidth":{"left":300,"right":301},"fileSortOrder":"alphabetical","promptDelete":false,"readableLineLength":true,"alwaysUpdateLinks":true,"spellcheck":true,"strictLineBreaks":true,"spellcheckDictionary":[],"autoPairMarkdown":false,"autoPairBrackets":false,"showFrontmatter":true,"enabledPlugins":["todoist-sync-plugin"],"defaultViewMode":"preview","obsidianCss":false}
@ -107,4 +107,4 @@
    "Usage.md",
    "License.md"
  ]
}
@ -1 +1 @@
{}
@ -0,0 +1 @@
../CHANGELOG.md

@ -1 +0,0 @@
../CHANGES.md

@ -0,0 +1 @@
../CONTRIBUTING.md
@ -1,14 +1,5 @@
# Release process

- [ ] Update version number in `Cargo.toml`
- [ ] Run `cargo check`
- [ ] Commit changes to `Cargo.*` with the message format `Release vN.N.N`
- [ ] Make git tag `vN.N.N`
- [ ] Run `gitchangelog`, review and make any manual adjustments as needed
- [ ] Regenerate README: `docs/generate.sh`
- [ ] Stage `CHANGES.md`, `README.md` and amend previous commit
- [ ] Force update git tag `vN.N.N`
- [ ] Push changes & tag
- [ ] Wait for builds to turn green (<https://github.com/zoni/obsidian-export/actions>)
- [ ] Run `cargo publish`
- [ ] Publish drafted release (<https://github.com/zoni/obsidian-export/releases>)
- [ ] Run `./make-new-release.sh`
- [ ] Push the created release commit/tag to GitHub
- [ ] Wait for builds to turn green (<https://github.com/zoni/obsidian-export/actions>) and confirm everything looks OK.
@ -4,5 +4,6 @@
![[usage-basic]]
![[usage-advanced]]
![[usage-library]]
![[contribute]]
![[license]]
![[CHANGES]]
![[changes]]
@ -6,7 +6,7 @@ WARNING:
the docs directory.

Instead of editing README.md, edit the corresponding Markdown files in the
docs directory and run generate.sh.

To add new sections, create new files under docs and add these to _combined.md
@ -0,0 +1,3 @@
# Changelog

For a list of releases and the changes with each version, please refer to the [[CHANGELOG]].
@ -0,0 +1,4 @@
# Contributing

I will happily accept bug fixes as well as enhancements, as long as they align with the overall scope and vision of the project.
Please see [CONTRIBUTING](CONTRIBUTING.md) for more information.
@ -1,4 +1,4 @@
#!/bin/sh
#!/bin/bash

set -euo pipefail
@ -2,10 +2,9 @@

## Pre-built binaries

Binary releases for x86-64 processors are provided for Windows, Linux and Mac operating systems on a best-effort basis.
They are built with GitHub runners as part of the release workflow defined in `.github/workflows/release.yml`.
Pre-compiled binaries for all major platforms are available at <https://github.com/zoni/obsidian-export/releases>

The resulting binaries can be downloaded from <https://github.com/zoni/obsidian-export/releases>
In addition to the installation scripts provided, these releases are also suitable for [installation with cargo-binstall][cargo-binstall].

## Building from source

@ -24,3 +23,4 @@ If you downloaded a pre-built binary, upgrade by downloading the latest version
If you built from source, upgrade by running `cargo install obsidian-export` again.

[Cargo]: https://doc.rust-lang.org/cargo/
[cargo-binstall]: https://github.com/cargo-bins/cargo-binstall#readme
@ -1,8 +1,9 @@
# License

Obsidian-export is dual-licensed under the [Apache 2.0] and the [MIT] licenses.
Obsidian-export is open-source software released under the [BSD-2-Clause Plus Patent License].
This license is designed to provide: a) a simple permissive license; b) that is compatible with the GNU General Public License (GPL), version 2; and c) which also has an express patent grant included.

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in this project by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
Please review the [LICENSE] file for the full text of the license.

[Apache 2.0]: https://github.com/zoni/obsidian-export/blob/master/LICENSE-APACHE
[MIT]: https://github.com/zoni/obsidian-export/blob/master/LICENSE-MIT
[BSD-2-Clause Plus Patent License]: https://spdx.org/licenses/BSD-2-Clause-Patent.html
[LICENSE]: LICENSE
@ -12,9 +12,14 @@ To completely remove any frontmatter from exported notes, use `--frontmatter=nev

## Ignoring files

By default, hidden files, patterns listed in `.export-ignore` as well as any files ignored by git (if your vault is part of a git repository) will be excluded from exports.
The following files are not exported by default:

* hidden files (can be adjusted with `--hidden`)
* files matching a pattern listed in `.export-ignore` (can be adjusted with `--ignore-file`)
* any files that are ignored by git (can be adjusted with `--no-git`)
* using `--skip-tags foo --skip-tags bar` will skip any files that have the tags `foo` or `bar` in their frontmatter
* using `--only-tags foo --only-tags bar` will skip any files that **don't** have the tags `foo` or `bar` in their frontmatter

These options may be adjusted with `--hidden`, `--ignore-file` and `--no-git` if desired.
(See `--help` for more information).

Notes linking to ignored notes will be unlinked (they'll only include the link text).
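As a sketch of what this can look like in practice, here is a hypothetical `.export-ignore` file (the directory and file names are made up for illustration; the patterns assume the gitignore-style syntax these files use):

```
# Keep private notes and drawing sources out of the export
private/
templates/
*.excalidraw.md
```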
@ -112,4 +117,4 @@ With these hooks in place, links to both notes as well as file attachments shoul

[gitignore]: https://git-scm.com/docs/gitignore
[hugo-relative-linking]: https://notes.nick.groenen.me/notes/relative-linking-in-hugo/
[hugo]: https://gohugo.io
[markdown render hooks]: https://gohugo.io/getting-started/configuration-markup#markdown-render-hooks
@ -3,13 +3,15 @@

The main interface of _obsidian-export_ is the `obsidian-export` CLI command.
As a text interface, this must be run from a terminal or Windows PowerShell.

It is assumed that you have basic familiarity with command-line interfaces and that you set up your `PATH` correctly if you installed with `cargo`.
Running `obsidian-export --version` should print a version number rather than giving some kind of error.

> If you downloaded a pre-built binary and didn't put it in a location referenced by `PATH` (for example, you put it in `Downloads`), you will need to provide the full path to the binary instead.
>
> For example `~/Downloads/obsidian-export --version` on Mac/Linux or `~\Downloads\obsidian-export --version` on Windows (PowerShell).

## Exporting notes

In its most basic form, `obsidian-export` takes just two mandatory arguments, a source and a destination:

```sh

@ -31,6 +33,31 @@ obsidian-export my-obsidian-vault/some-note.md /tmp/export/
obsidian-export my-obsidian-vault/some-note.md /tmp/exported-note.md
```

Note that in this mode, obsidian-export sees `some-note.md` as being the only file that exists in your vault so references to other notes won't be resolved.
This is by design.

If you'd like to export a single note while resolving links or embeds to other areas in your vault then you should instead specify the root of your vault as the source, passing the file you'd like to export with `--start-at`, as described in the next section.

### Exporting a partial vault

Using the `--start-at` argument, you can export just a subset of your vault.
Given the following vault structure:

```
my-obsidian-vault
├── Notes/
├── Books/
└── People/
```

This will export only the notes in the `Books` directory to `exported-notes`:

```sh
obsidian-export my-obsidian-vault --start-at my-obsidian-vault/Books exported-notes
```

In this mode, all notes under the source (the first argument) are considered part of the vault so any references to these files will remain intact, even if they're not part of the exported notes.

## Character encodings

At present, UTF-8 character encoding is assumed for all note text as well as filenames.
@ -0,0 +1,47 @@
#!/usr/bin/env bash

set -euo pipefail

get_next_version_number() {
  DATEPART=$(date +%y.%-m)
  ITERATION=0

  while true; do
    VERSION_STRING="${DATEPART}.${ITERATION}"
    if git rev-list "v$VERSION_STRING" > /dev/null 2>&1; then
      ((ITERATION++))
    else
      echo "$VERSION_STRING"
      return
    fi
  done
}

git add .
if ! git diff-index --quiet HEAD; then
  printf "Working directory is not clean. Please commit or stash your changes.\n"
  exit 1
fi

VERSION=$(get_next_version_number)
git tag "v${VERSION}"

git cliff --latest --prepend CHANGELOG.md > /dev/null
${EDITOR:-vim} CHANGELOG.md
docs/generate.sh

sed -i -E "s/^version = \".+\"$/version = \"${VERSION}\"/" Cargo.toml
cargo check

git add .
# There are likely trailing whitespace changes in the changelog, but a single
# run of pre-commit will fix these automatically.
pre-commit run || git add .

git commit --message "Release v${VERSION}"
git tag "v${VERSION}" --force

printf "\n\nSuccessfully created release %s\n" "v${VERSION}"
printf "\nYou'll probably want to continue with:\n"
printf "\tgit push origin main\n"
printf "\tgit push origin %s\n" "v${VERSION}"
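The `get_next_version_number` logic above produces a `YY.M.N` calendar version, bumping `N` until the corresponding tag is unused. A minimal Python sketch of the same scheme (with a plain `existing_tags` list standing in for the script's `git rev-list` check):

```python
from datetime import date

def next_version(existing_tags, today=None):
    """Compute the next vYY.M.N calendar version, bumping N until it's unused."""
    today = today or date.today()
    # Mirrors `date +%y.%-m`: zero-padded two-digit year, unpadded month.
    datepart = f"{today.year % 100:02d}.{today.month}"
    iteration = 0
    while f"v{datepart}.{iteration}" in existing_tags:
        iteration += 1
    return f"{datepart}.{iteration}"
```

For example, in January 2023 with `v23.1.0` and `v23.1.1` already tagged, this yields `23.1.2`.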
@ -1,5 +1,5 @@
use crate::Frontmatter;
use std::path::PathBuf;
use std::path::{Path, PathBuf};

#[derive(Debug, Clone)]
/// Context holds metadata about a note which is being parsed.

@ -55,7 +55,7 @@ impl Context {
    }

    /// Create a new `Context` which inherits from a parent Context.
    pub fn from_parent(context: &Context, child: &PathBuf) -> Context {
    pub fn from_parent(context: &Context, child: &Path) -> Context {
        let mut context = context.clone();
        context.file_tree.push(child.to_path_buf());
        context
@ -40,6 +40,7 @@ pub fn frontmatter_to_str(frontmatter: Frontmatter) -> Result<String> {
    }

    let mut buffer = String::new();
    buffer.push_str("---\n");
    buffer.push_str(&serde_yaml::to_string(&frontmatter)?);
    buffer.push_str("---\n");
    Ok(buffer)
314
src/lib.rs
@ -6,6 +6,7 @@ extern crate lazy_static;

mod context;
mod frontmatter;
pub mod postprocessors;
mod references;
mod walker;

@ -16,7 +17,7 @@ pub use walker::{vault_contents, WalkOptions};
use frontmatter::{frontmatter_from_str, frontmatter_to_str};
use pathdiff::diff_paths;
use percent_encoding::{utf8_percent_encode, AsciiSet, CONTROLS};
use pulldown_cmark::{CodeBlockKind, CowStr, Event, Options, Parser, Tag};
use pulldown_cmark::{CodeBlockKind, CowStr, Event, HeadingLevel, Options, Parser, Tag};
use pulldown_cmark_to_cmark::cmark_with_options;
use rayon::prelude::*;
use references::*;

@ -29,6 +30,7 @@ use std::io::prelude::*;
use std::io::ErrorKind;
use std::path::{Path, PathBuf};
use std::str;
use unicode_normalization::UnicodeNormalization;

/// A series of markdown [Event]s that are generated while traversing an Obsidian markdown note.
pub type MarkdownEvents<'a> = Vec<Event<'a>>;
@ -45,6 +47,26 @@ pub type MarkdownEvents<'a> = Vec<Event<'a>>;
/// 3. Prevent later postprocessors from running ([PostprocessorResult::StopHere]) or cause a note
///    to be skipped entirely ([PostprocessorResult::StopAndSkipNote]).
///
/// # Postprocessors and embeds
///
/// Postprocessors normally run at the end of the export phase, once notes have been fully parsed.
/// This means that any embedded notes have been resolved and merged into the final note already.
///
/// In some cases it may be desirable to change the contents of these embedded notes *before* they
/// are inserted into the final document. This is possible through the use of
/// [Exporter::add_embed_postprocessor].
/// These "embed postprocessors" run much the same way as regular postprocessors, but they're run on
/// the note that is about to be embedded in another note. In addition:
///
/// - Changes to context carry over to later embed postprocessors, but are then discarded. This
///   means that changes to frontmatter do not propagate to the root note for example.
/// - [PostprocessorResult::StopAndSkipNote] prevents the embedded note from being included (it's
///   replaced with a blank document) but doesn't affect the root note.
///
/// It's possible to pass the same functions to [Exporter::add_postprocessor] and
/// [Exporter::add_embed_postprocessor]. The [Context::note_depth] method may be used to determine
/// whether a note is a root note or an embedded note in this situation.
///
/// # Examples
///
/// ## Update frontmatter

@ -53,9 +75,8 @@ pub type MarkdownEvents<'a> = Vec<Event<'a>>;
/// defined inline as a closure.
///
/// ```
/// use obsidian_export::{Context, Exporter, MarkdownEvents, PostprocessorResult};
/// use obsidian_export::pulldown_cmark::{CowStr, Event};
/// use obsidian_export::serde_yaml::Value;
/// use obsidian_export::{Exporter, PostprocessorResult};
/// # use std::path::PathBuf;
/// # use tempfile::TempDir;
///

@ -65,7 +86,7 @@ pub type MarkdownEvents<'a> = Vec<Event<'a>>;
/// let mut exporter = Exporter::new(source, destination);
///
/// // add_postprocessor registers a new postprocessor. In this example we use a closure.
/// exporter.add_postprocessor(&|mut context, events| {
/// exporter.add_postprocessor(&|context, _events| {
///     // This is the key we'll insert into the frontmatter. In this case, the string "foo".
///     let key = Value::String("foo".to_string());
///     // This is the value we'll insert into the frontmatter. In this case, the string "bar".

@ -74,9 +95,8 @@ pub type MarkdownEvents<'a> = Vec<Event<'a>>;
///     // Frontmatter can be updated in-place, so we can call insert on it directly.
///     context.frontmatter.insert(key, value);
///
///     // Postprocessors must return their (modified) context, the markdown events that make
///     // up the note and a next action to take.
///     (context, events, PostprocessorResult::Continue)
///     // This return value indicates processing should continue.
///     PostprocessorResult::Continue
/// });
///
/// exporter.run().unwrap();
@ -96,18 +116,13 @@ pub type MarkdownEvents<'a> = Vec<Event<'a>>;
/// # use tempfile::TempDir;
/// #
/// /// This postprocessor replaces any instance of "foo" with "bar" in the note body.
/// fn foo_to_bar(
///     context: Context,
///     events: MarkdownEvents,
/// ) -> (Context, MarkdownEvents, PostprocessorResult) {
///     let events = events
///         .into_iter()
///         .map(|event| match event {
///             Event::Text(text) => Event::Text(CowStr::from(text.replace("foo", "bar"))),
///             event => event,
///         })
///         .collect();
///     (context, events, PostprocessorResult::Continue)
/// fn foo_to_bar(context: &mut Context, events: &mut MarkdownEvents) -> PostprocessorResult {
///     for event in events.iter_mut() {
///         if let Event::Text(text) = event {
///             *event = Event::Text(CowStr::from(text.replace("foo", "bar")))
///         }
///     }
///     PostprocessorResult::Continue
/// }
///
/// # let tmp_dir = TempDir::new().expect("failed to make tempdir");

@ -118,8 +133,8 @@ pub type MarkdownEvents<'a> = Vec<Event<'a>>;
/// # exporter.run().unwrap();
/// ```

pub type Postprocessor =
    dyn Fn(Context, MarkdownEvents) -> (Context, MarkdownEvents, PostprocessorResult) + Send + Sync;
pub type Postprocessor<'f> =
    dyn Fn(&mut Context, &mut MarkdownEvents) -> PostprocessorResult + Send + Sync + 'f;
type Result<T, E = ExportError> = std::result::Result<T, E>;

const PERCENTENCODE_CHARS: &AsciiSet = &CONTROLS.add(b' ').add(b'(').add(b')').add(b'%').add(b'?');
@ -190,7 +205,7 @@ pub enum ExportError {
    },
}

#[derive(Debug, Clone, Copy, PartialEq)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
/// Emitted by [Postprocessor]s to signal the next action to take.
pub enum PostprocessorResult {
    /// Continue with the next post-processor (if any).

@ -211,11 +226,13 @@ pub enum PostprocessorResult {
pub struct Exporter<'a> {
    root: PathBuf,
    destination: PathBuf,
    start_at: PathBuf,
    frontmatter_strategy: FrontmatterStrategy,
    vault_contents: Option<Vec<PathBuf>>,
    walk_options: WalkOptions<'a>,
    process_embeds_recursively: bool,
    postprocessors: Vec<&'a Postprocessor>,
    postprocessors: Vec<&'a Postprocessor<'a>>,
    embed_postprocessors: Vec<&'a Postprocessor<'a>>,
}

impl<'a> fmt::Debug for Exporter<'a> {

@ -234,25 +251,43 @@ impl<'a> fmt::Debug for Exporter<'a> {
            "postprocessors",
            &format!("<{} postprocessors active>", self.postprocessors.len()),
        )
        .field(
            "embed_postprocessors",
            &format!(
                "<{} postprocessors active>",
                self.embed_postprocessors.len()
            ),
        )
        .finish()
    }
}

impl<'a> Exporter<'a> {
    /// Create a new exporter which reads notes from `source` and exports these to
    /// Create a new exporter which reads notes from `root` and exports these to
    /// `destination`.
    pub fn new(source: PathBuf, destination: PathBuf) -> Exporter<'a> {
    pub fn new(root: PathBuf, destination: PathBuf) -> Exporter<'a> {
        Exporter {
            root: source,
            start_at: root.clone(),
            root,
            destination,
            frontmatter_strategy: FrontmatterStrategy::Auto,
            walk_options: WalkOptions::default(),
            process_embeds_recursively: true,
            vault_contents: None,
            postprocessors: vec![],
            embed_postprocessors: vec![],
        }
    }

    /// Set a custom starting point for the export.
    ///
    /// Normally all notes under `root` (except for notes excluded by ignore rules) will be exported.
    /// When `start_at` is set, only notes under this path will be exported to the target destination.
    pub fn start_at(&mut self, start_at: PathBuf) -> &mut Exporter<'a> {
        self.start_at = start_at;
        self
    }

    /// Set the [`WalkOptions`] to be used for this exporter.
    pub fn walk_options(&mut self, options: WalkOptions<'a>) -> &mut Exporter<'a> {
        self.walk_options = options;
@ -284,6 +319,12 @@ impl<'a> Exporter<'a> {
        self
    }

    /// Append a function to the chain of [postprocessors][Postprocessor] for embeds.
    pub fn add_embed_postprocessor(&mut self, processor: &'a Postprocessor) -> &mut Exporter<'a> {
        self.embed_postprocessors.push(processor);
        self
    }

    /// Export notes using the settings configured on this exporter.
    pub fn run(&mut self) -> Result<()> {
        if !self.root.exists() {

@ -292,13 +333,17 @@ impl<'a> Exporter<'a> {
            });
        }

        // When a single file is specified, we can short-circuit construction of walk and associated
        // directory traversal. This also allows us to accept destination as either a file or a
        // directory name.
        if self.root.is_file() {
            self.vault_contents = Some(vec![self.root.clone()]);
        self.vault_contents = Some(vault_contents(
            self.root.as_path(),
            self.walk_options.clone(),
        )?);

        // When a single file is specified, we just need to export that specific file instead of
        // iterating over all discovered files. This also allows us to accept destination as either
        // a file or a directory name.
        if self.root.is_file() || self.start_at.is_file() {
            let source_filename = self
                .root
                .start_at
                .file_name()
                .expect("File without a filename? How is that possible?")
                .to_string_lossy();

@ -317,7 +362,7 @@ impl<'a> Exporter<'a> {
                    self.destination.clone()
                }
            };
            return Ok(self.export_note(&self.root, &destination)?);
            return self.export_note(&self.start_at, &destination);
        }

        if !self.destination.exists() {

@ -325,22 +370,18 @@ impl<'a> Exporter<'a> {
                path: self.destination.clone(),
            });
        }

        self.vault_contents = Some(vault_contents(
            self.root.as_path(),
            self.walk_options.clone(),
        )?);
        self.vault_contents
            .as_ref()
            .unwrap()
            .clone()
            .into_par_iter()
            .filter(|file| file.starts_with(&self.start_at))
            .try_for_each(|file| {
                let relative_path = file
                    .strip_prefix(&self.root.clone())
                    .strip_prefix(&self.start_at.clone())
                    .expect("file should always be nested under root")
                    .to_path_buf();
                let destination = &self.destination.join(&relative_path);
                let destination = &self.destination.join(relative_path);
                self.export_note(&file, destination)
            })?;
        Ok(())
@ -351,22 +392,19 @@ impl<'a> Exporter<'a> {
            true => self.parse_and_export_obsidian_note(src, dest),
            false => copy_file(src, dest),
        }
        .context(FileExportError { path: src })
        .context(FileExportSnafu { path: src })
    }

    fn parse_and_export_obsidian_note(&self, src: &Path, dest: &Path) -> Result<()> {
        let mut context = Context::new(src.to_path_buf(), dest.to_path_buf());

        let (frontmatter, mut markdown_events) = self.parse_obsidian_note(&src, &context)?;
        let (frontmatter, mut markdown_events) = self.parse_obsidian_note(src, &context)?;
        context.frontmatter = frontmatter;
        for func in &self.postprocessors {
            let res = func(context, markdown_events);
            context = res.0;
            markdown_events = res.1;
            match res.2 {
            match func(&mut context, &mut markdown_events) {
                PostprocessorResult::StopHere => break,
                PostprocessorResult::StopAndSkipNote => return Ok(()),
                _ => (),
                PostprocessorResult::Continue => (),
            }
        }

@ -379,15 +417,15 @@ impl<'a> Exporter<'a> {
        };
        if write_frontmatter {
            let mut frontmatter_str = frontmatter_to_str(context.frontmatter)
                .context(FrontMatterEncodeError { path: src })?;
                .context(FrontMatterEncodeSnafu { path: src })?;
            frontmatter_str.push('\n');
            outfile
                .write_all(frontmatter_str.as_bytes())
                .context(WriteError { path: &dest })?;
                .context(WriteSnafu { path: &dest })?;
        }
        outfile
            .write_all(render_mdevents_to_mdtext(markdown_events).as_bytes())
            .context(WriteError { path: &dest })?;
            .context(WriteSnafu { path: &dest })?;
        Ok(())
    }

@ -401,11 +439,11 @@ impl<'a> Exporter<'a> {
                file_tree: context.file_tree(),
            });
        }
        let content = fs::read_to_string(&path).context(ReadError { path })?;
        let content = fs::read_to_string(path).context(ReadSnafu { path })?;
        let (frontmatter, content) =
            matter::matter(&content).unwrap_or(("".to_string(), content.to_string()));
        let frontmatter =
            frontmatter_from_str(&frontmatter).context(FrontMatterDecodeError { path })?;
            frontmatter_from_str(&frontmatter).context(FrontMatterDecodeSnafu { path })?;

        let mut parser_options = Options::empty();
        parser_options.insert(Options::ENABLE_TABLES);
@@ -526,12 +564,12 @@ impl<'a> Exporter<'a> {
         let note_ref = ObsidianNoteReference::from_str(link_text);

         let path = match note_ref.file {
-            Some(file) => lookup_filename_in_vault(file, &self.vault_contents.as_ref().unwrap()),
+            Some(file) => lookup_filename_in_vault(file, self.vault_contents.as_ref().unwrap()),

             // If file is None, the reference points to a section or id within the same file,
             // and the current embed logic would fail, recursing until it reaches its limit.
             // For now we just bail early.
-            None => return Ok(self.make_link_to_file(note_ref, &context)),
+            None => return Ok(self.make_link_to_file(note_ref, context)),
         };

         if path.is_none() {
@@ -547,7 +585,7 @@ impl<'a> Exporter<'a> {
         }

         let path = path.unwrap();
-        let child_context = Context::from_parent(context, path);
+        let mut child_context = Context::from_parent(context, path);
         let no_ext = OsString::new();

         if !self.process_embeds_recursively && context.file_tree().contains(path) {
@@ -560,13 +598,25 @@ impl<'a> Exporter<'a> {

         let events = match path.extension().unwrap_or(&no_ext).to_str() {
             Some("md") => {
-                let (_frontmatter, mut events) = self.parse_obsidian_note(&path, &child_context)?;
+                let (frontmatter, mut events) = self.parse_obsidian_note(path, &child_context)?;
+                child_context.frontmatter = frontmatter;
                 if let Some(section) = note_ref.section {
                     events = reduce_to_section(events, section);
                 }
+                for func in &self.embed_postprocessors {
+                    // Postprocessors running on embeds shouldn't be able to change frontmatter (or
+                    // any other metadata), so we give them a clone of the context.
+                    match func(&mut child_context, &mut events) {
+                        PostprocessorResult::StopHere => break,
+                        PostprocessorResult::StopAndSkipNote => {
+                            events = vec![];
+                        }
+                        PostprocessorResult::Continue => (),
+                    }
+                }
                 events
             }
-            Some("png") | Some("jpg") | Some("jpeg") | Some("gif") | Some("webp") => {
+            Some("png") | Some("jpg") | Some("jpeg") | Some("gif") | Some("webp") | Some("svg") => {
                 self.make_link_to_file(note_ref, &child_context)
                     .into_iter()
                     .map(|event| match event {
@@ -597,14 +647,14 @@ impl<'a> Exporter<'a> {
         Ok(events)
     }

-    fn make_link_to_file<'b, 'c>(
+    fn make_link_to_file<'c>(
         &self,
-        reference: ObsidianNoteReference<'b>,
+        reference: ObsidianNoteReference<'_>,
         context: &Context,
     ) -> MarkdownEvents<'c> {
         let target_file = reference
             .file
-            .map(|file| lookup_filename_in_vault(file, &self.vault_contents.as_ref().unwrap()))
+            .map(|file| lookup_filename_in_vault(file, self.vault_contents.as_ref().unwrap()))
             .unwrap_or_else(|| Some(context.current_file()));

         if target_file.is_none() {
@@ -628,7 +678,7 @@ impl<'a> Exporter<'a> {
         // in case of embedded notes.
         let rel_link = diff_paths(
             target_file,
-            &context
+            context
                 .root_file()
                 .parent()
                 .expect("obsidian content files should always have a parent"),
@@ -657,22 +707,33 @@ impl<'a> Exporter<'a> {
     }
 }

 /// Get the full path for the given filename when it's contained in vault_contents, taking into
 /// account:
 ///
 /// 1. Standard Obsidian note references not including a .md extension.
 /// 2. Case-insensitive matching
 /// 3. Unicode normalization rules using normalization form C
 ///    (https://www.w3.org/TR/charmod-norm/#unicodeNormalization)
 fn lookup_filename_in_vault<'a>(
     filename: &str,
     vault_contents: &'a [PathBuf],
 ) -> Option<&'a PathBuf> {
     // Markdown files don't have their .md extension added by Obsidian, but other files (images,
     // PDFs, etc) do, so we match on both possibilities.
     //
     // References can also refer to notes in a different case (to lowercase text in a
     // sentence even if the note is capitalized, for example), so we also try a case-insensitive
     // lookup.
     let filename = PathBuf::from(filename);
+    let filename_normalized = filename.to_string_lossy().nfc().collect::<String>();

     vault_contents.iter().find(|path| {
-        let path_lowered = PathBuf::from(path.to_string_lossy().to_lowercase());
-        path.ends_with(&filename)
-            || path_lowered.ends_with(&filename.to_lowercase())
-            || path.ends_with(format!("{}.md", &filename))
-            || path_lowered.ends_with(format!("{}.md", &filename.to_lowercase()))
+        let path_normalized_str = path.to_string_lossy().nfc().collect::<String>();
+        let path_normalized = PathBuf::from(&path_normalized_str);
+        let path_normalized_lowered = PathBuf::from(&path_normalized_str.to_lowercase());
+
+        // It would be convenient if we could just do `filename.set_extension("md")` at the start
+        // of this function so we don't need multiple separate + ".md" match cases here, however
+        // that would break with a reference of `[[Note.1]]` linking to `[[Note.1.md]]`.
+
+        path_normalized.ends_with(&filename_normalized)
+            || path_normalized.ends_with(filename_normalized.clone() + ".md")
+            || path_normalized_lowered.ends_with(filename_normalized.to_lowercase())
+            || path_normalized_lowered.ends_with(filename_normalized.to_lowercase() + ".md")
     })
 }
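Editor's note: the normalization-aware lookup above hinges on visually identical filenames having different byte encodings. A minimal std-only sketch of the mismatch that the `.nfc()` calls (from the `unicode_normalization` crate used in the diff) exist to repair:

```rust
fn main() {
    // "Noteä.md" written two ways: precomposed U+00E4 vs. ASCII 'a' followed
    // by combining diaeresis U+0308. They render identically on screen but
    // differ byte-for-byte, so a plain path comparison misses the match.
    let composed = "Note\u{E4}.md";
    let decomposed = "Notea\u{308}.md";
    assert_ne!(composed, decomposed);
    assert_eq!(composed.len(), 9); // "ä" is 2 bytes in UTF-8: 0xC3 0xA4
    assert_eq!(decomposed.len(), 10); // 'a' + U+0308 is 3 bytes: 0x61 0xCC 0x88
    // Normalizing both sides to NFC, as the lookup above does, makes the two
    // forms compare equal.
}
```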
@@ -681,7 +742,6 @@ fn render_mdevents_to_mdtext(markdown: MarkdownEvents) -> String {
     cmark_with_options(
         markdown.iter(),
         &mut buffer,
         None,
         pulldown_cmark_to_cmark::Options::default(),
     )
     .expect("formatting to string not expected to fail");
@@ -690,32 +750,28 @@ fn render_mdevents_to_mdtext(markdown: MarkdownEvents) -> String {
 }

 fn create_file(dest: &Path) -> Result<File> {
-    let file = File::create(&dest)
+    let file = File::create(dest)
         .or_else(|err| {
             if err.kind() == ErrorKind::NotFound {
                 let parent = dest.parent().expect("file should have a parent directory");
-                if let Err(err) = std::fs::create_dir_all(&parent) {
-                    return Err(err);
-                }
+                std::fs::create_dir_all(parent)?
             }
-            File::create(&dest)
+            File::create(dest)
         })
-        .context(WriteError { path: dest })?;
+        .context(WriteSnafu { path: dest })?;
     Ok(file)
 }

 fn copy_file(src: &Path, dest: &Path) -> Result<()> {
-    std::fs::copy(&src, &dest)
+    std::fs::copy(src, dest)
         .or_else(|err| {
             if err.kind() == ErrorKind::NotFound {
                 let parent = dest.parent().expect("file should have a parent directory");
-                if let Err(err) = std::fs::create_dir_all(&parent) {
-                    return Err(err);
-                }
+                std::fs::create_dir_all(parent)?
             }
-            std::fs::copy(&src, &dest)
+            std::fs::copy(src, dest)
         })
-        .context(WriteError { path: dest })?;
+        .context(WriteSnafu { path: dest })?;
     Ok(())
 }
@@ -727,18 +783,19 @@ fn is_markdown_file(file: &Path) -> bool {

 /// Reduce a given `MarkdownEvents` to just those elements which are children of the given section
 /// (heading name).
-fn reduce_to_section<'a, 'b>(events: MarkdownEvents<'a>, section: &'b str) -> MarkdownEvents<'a> {
+fn reduce_to_section<'a>(events: MarkdownEvents<'a>, section: &str) -> MarkdownEvents<'a> {
     let mut filtered_events = Vec::with_capacity(events.len());
     let mut target_section_encountered = false;
     let mut currently_in_target_section = false;
-    let mut section_level = 0;
-    let mut last_level = 0;
+    let mut section_level = HeadingLevel::H1;
+    let mut last_level = HeadingLevel::H1;
     let mut last_tag_was_heading = false;

     for event in events.into_iter() {
         filtered_events.push(event.clone());
         match event {
-            Event::Start(Tag::Heading(level)) => {
+            // FIXME: This should propagate fragment_identifier and classes.
+            Event::Start(Tag::Heading(level, _fragment_identifier, _classes)) => {
                 last_tag_was_heading = true;
                 last_level = level;
                 if currently_in_target_section && level <= section_level {
@@ -794,7 +851,10 @@ fn event_to_owned<'a>(event: Event) -> Event<'a> {
 fn tag_to_owned<'a>(tag: Tag) -> Tag<'a> {
     match tag {
         Tag::Paragraph => Tag::Paragraph,
-        Tag::Heading(level) => Tag::Heading(level),
+        Tag::Heading(level, _fragment_identifier, _classes) => {
+            // FIXME: This should propagate fragment_identifier and classes.
+            Tag::Heading(level, None, Vec::new())
+        }
         Tag::BlockQuote => Tag::BlockQuote,
         Tag::CodeBlock(codeblock_kind) => Tag::CodeBlock(codeblock_kind_to_owned(codeblock_kind)),
         Tag::List(optional) => Tag::List(optional),
@@ -828,3 +888,77 @@ fn codeblock_kind_to_owned<'a>(codeblock_kind: CodeBlockKind) -> CodeBlockKind<'a> {
         CodeBlockKind::Fenced(cowstr) => CodeBlockKind::Fenced(CowStr::from(cowstr.into_string())),
     }
 }
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+    use pretty_assertions::assert_eq;
+    use rstest::rstest;
+
+    lazy_static! {
+        static ref VAULT: Vec<std::path::PathBuf> = vec![
+            PathBuf::from("NoteA.md"),
+            PathBuf::from("Document.pdf"),
+            PathBuf::from("Note.1.md"),
+            PathBuf::from("nested/NoteA.md"),
+            PathBuf::from("Note\u{E4}.md"), // Noteä.md, see also encodings() below
+        ];
+    }
+
+    #[test]
+    fn encodings() {
+        // Standard "Latin Small Letter A with Diaeresis" (U+00E4)
+        // Encoded in UTF-8 as two bytes: 0xC3 0xA4
+        assert_eq!(String::from_utf8(vec![0xC3, 0xA4]).unwrap(), "ä");
+        assert_eq!("\u{E4}", "ä");
+
+        // Basic (ASCII) lowercase a followed by Unicode Character "◌̈" (U+0308)
+        // Renders the same visual appearance but is encoded in UTF-8 as three bytes:
+        // 0x61 0xCC 0x88
+        assert_eq!(String::from_utf8(vec![0x61, 0xCC, 0x88]).unwrap(), "ä");
+        assert_eq!("a\u{308}", "ä");
+        assert_eq!("\u{61}\u{308}", "ä");
+
+        // For more examples and a better explanation of this concept, see
+        // https://www.w3.org/TR/charmod-norm/#aringExample
+    }
+
+    #[rstest]
+    // Exact match
+    #[case("NoteA.md", "NoteA.md")]
+    #[case("NoteA", "NoteA.md")]
+    // Same note in subdir, exact match should find it
+    #[case("nested/NoteA.md", "nested/NoteA.md")]
+    #[case("nested/NoteA", "nested/NoteA.md")]
+    // Different extensions
+    #[case("Document.pdf", "Document.pdf")]
+    #[case("Note.1", "Note.1.md")]
+    #[case("Note.1.md", "Note.1.md")]
+    // Case-insensitive matches
+    #[case("notea.md", "NoteA.md")]
+    #[case("notea", "NoteA.md")]
+    #[case("NESTED/notea.md", "nested/NoteA.md")]
+    #[case("NESTED/notea", "nested/NoteA.md")]
+    // "Latin Small Letter A with Diaeresis" (U+00E4)
+    #[case("Note\u{E4}.md", "Note\u{E4}.md")]
+    #[case("Note\u{E4}", "Note\u{E4}.md")]
+    // Basic (ASCII) lowercase a followed by Unicode Character "◌̈" (U+0308)
+    // The UTF-8 encoding is different but it renders the same visual appearance as the case above,
+    // so we expect it to find the same file.
+    #[case("Note\u{61}\u{308}.md", "Note\u{E4}.md")]
+    #[case("Note\u{61}\u{308}", "Note\u{E4}.md")]
+    // We should expect this to work with lowercasing as well, so NoteÄ should find Noteä
+    // NoteÄ where Ä = single Ä (U+00C4)
+    #[case("Note\u{C4}.md", "Note\u{E4}.md")]
+    #[case("Note\u{C4}", "Note\u{E4}.md")]
+    // NoteÄ where Ä = decomposed to A (U+0041) + ◌̈ (U+0308)
+    #[case("Note\u{41}\u{308}.md", "Note\u{E4}.md")]
+    #[case("Note\u{41}\u{308}", "Note\u{E4}.md")]
+    fn test_lookup_filename_in_vault(#[case] input: &str, #[case] expected: &str) {
+        let result = lookup_filename_in_vault(input, &VAULT);
+        println!("Test input: {:?}", input);
+        println!("Expecting:  {:?}", expected);
+        println!("Got:        {:?}", result.unwrap_or(&PathBuf::from("")));
+        assert_eq!(result, Some(&PathBuf::from(expected)))
+    }
+}
src/main.rs (40 changed lines)
@@ -1,9 +1,10 @@
 use eyre::{eyre, Result};
 use gumdrop::Options;
-use obsidian_export::{ExportError, Exporter, FrontmatterStrategy, WalkOptions};
+use obsidian_export::{postprocessors::*, ExportError};
+use obsidian_export::{Exporter, FrontmatterStrategy, WalkOptions};
 use std::{env, path::PathBuf};

-const VERSION: &'static str = env!("CARGO_PKG_VERSION");
+const VERSION: &str = env!("CARGO_PKG_VERSION");

 #[derive(Debug, Options)]
 struct Opts {
@@ -13,12 +14,15 @@ struct Opts {
     #[options(help = "Display version information")]
     version: bool,

-    #[options(help = "Source file containing reference", free, required)]
+    #[options(help = "Read notes from this source", free, required)]
     source: Option<PathBuf>,

-    #[options(help = "Destination file being linked to", free, required)]
+    #[options(help = "Write notes to this destination", free, required)]
     destination: Option<PathBuf>,

+    #[options(no_short, help = "Only export notes under this sub-path")]
+    start_at: Option<PathBuf>,
+
     #[options(
         help = "Frontmatter strategy (one of: always, never, auto)",
         no_short,
@@ -35,6 +39,12 @@ struct Opts {
     )]
     ignore_file: String,

+    #[options(no_short, help = "Exclude files with this tag from the export")]
+    skip_tags: Vec<String>,
+
+    #[options(no_short, help = "Export only files with this tag")]
+    only_tags: Vec<String>,
+
     #[options(no_short, help = "Export hidden files", default = "false")]
     hidden: bool,
@@ -43,6 +53,13 @@ struct Opts {

     #[options(no_short, help = "Don't process embeds recursively", default = "false")]
     no_recursive_embeds: bool,
+
+    #[options(
+        no_short,
+        help = "Convert soft line breaks to hard line breaks. This mimics Obsidian's 'Strict line breaks' setting",
+        default = "false"
+    )]
+    hard_linebreaks: bool,
 }

 fn frontmatter_strategy_from_str(input: &str) -> Result<FrontmatterStrategy> {
@@ -64,7 +81,7 @@ fn main() {
     }

     let args = Opts::parse_args_default_or_exit();
-    let source = args.source.unwrap();
+    let root = args.source.unwrap();
     let destination = args.destination.unwrap();

     let walk_options = WalkOptions {
@@ -74,11 +91,22 @@ fn main() {
         ..Default::default()
     };

-    let mut exporter = Exporter::new(source, destination);
+    let mut exporter = Exporter::new(root, destination);
     exporter.frontmatter_strategy(args.frontmatter_strategy);
     exporter.process_embeds_recursively(!args.no_recursive_embeds);
     exporter.walk_options(walk_options);

+    if args.hard_linebreaks {
+        exporter.add_postprocessor(&softbreaks_to_hardbreaks);
+    }
+
+    let tags_postprocessor = filter_by_tags(args.skip_tags, args.only_tags);
+    exporter.add_postprocessor(&tags_postprocessor);
+
+    if let Some(path) = args.start_at {
+        exporter.start_at(path);
+    }
+
     if let Err(err) = exporter.run() {
         match err {
             ExportError::FileExportError {
@@ -0,0 +1,106 @@
+//! A collection of officially maintained [postprocessors][crate::Postprocessor].
+
+use super::{Context, MarkdownEvents, PostprocessorResult};
+use pulldown_cmark::Event;
+use serde_yaml::Value;
+
+/// This postprocessor converts all soft line breaks to hard line breaks. Enabling this mimics
+/// Obsidian's _'Strict line breaks'_ setting.
+pub fn softbreaks_to_hardbreaks(
+    _context: &mut Context,
+    events: &mut MarkdownEvents,
+) -> PostprocessorResult {
+    for event in events.iter_mut() {
+        if event == &Event::SoftBreak {
+            *event = Event::HardBreak;
+        }
+    }
+    PostprocessorResult::Continue
+}
+
+pub fn filter_by_tags(
+    skip_tags: Vec<String>,
+    only_tags: Vec<String>,
+) -> impl Fn(&mut Context, &mut MarkdownEvents) -> PostprocessorResult {
+    move |context: &mut Context, _events: &mut MarkdownEvents| -> PostprocessorResult {
+        match context.frontmatter.get("tags") {
+            None => filter_by_tags_(&[], &skip_tags, &only_tags),
+            Some(Value::Sequence(tags)) => filter_by_tags_(tags, &skip_tags, &only_tags),
+            _ => PostprocessorResult::Continue,
+        }
+    }
+}
+
+fn filter_by_tags_(
+    tags: &[Value],
+    skip_tags: &[String],
+    only_tags: &[String],
+) -> PostprocessorResult {
+    let skip = skip_tags
+        .iter()
+        .any(|tag| tags.contains(&Value::String(tag.to_string())));
+    let include = only_tags.is_empty()
+        || only_tags
+            .iter()
+            .any(|tag| tags.contains(&Value::String(tag.to_string())));
+
+    if skip || !include {
+        PostprocessorResult::StopAndSkipNote
+    } else {
+        PostprocessorResult::Continue
+    }
+}
+
+#[test]
+fn test_filter_tags() {
+    let tags = vec![
+        Value::String("skip".to_string()),
+        Value::String("publish".to_string()),
+    ];
+    let empty_tags = vec![];
+    assert_eq!(
+        filter_by_tags_(&empty_tags, &[], &[]),
+        PostprocessorResult::Continue,
+        "When no exclusion & inclusion are specified, files without tags are included"
+    );
+    assert_eq!(
+        filter_by_tags_(&tags, &[], &[]),
+        PostprocessorResult::Continue,
+        "When no exclusion & inclusion are specified, files with tags are included"
+    );
+    assert_eq!(
+        filter_by_tags_(&tags, &["exclude".to_string()], &[]),
+        PostprocessorResult::Continue,
+        "When exclusion tags don't match, files with tags are included"
+    );
+    assert_eq!(
+        filter_by_tags_(&empty_tags, &["exclude".to_string()], &[]),
+        PostprocessorResult::Continue,
+        "When exclusion tags don't match, files without tags are included"
+    );
+    assert_eq!(
+        filter_by_tags_(&tags, &[], &["publish".to_string()]),
+        PostprocessorResult::Continue,
+        "When inclusion tags match, files with those tags are included"
+    );
+    assert_eq!(
+        filter_by_tags_(&empty_tags, &[], &["include".to_string()]),
+        PostprocessorResult::StopAndSkipNote,
+        "When inclusion tags are specified, files without tags are excluded"
+    );
+    assert_eq!(
+        filter_by_tags_(&tags, &[], &["include".to_string()]),
+        PostprocessorResult::StopAndSkipNote,
+        "When inclusion tags don't match, files with tags are excluded"
+    );
+    assert_eq!(
+        filter_by_tags_(&tags, &["skip".to_string()], &["skip".to_string()]),
+        PostprocessorResult::StopAndSkipNote,
+        "When both inclusion and exclusion tags are the same, exclusion wins"
+    );
+    assert_eq!(
+        filter_by_tags_(&tags, &["skip".to_string()], &["publish".to_string()]),
+        PostprocessorResult::StopAndSkipNote,
+        "When both inclusion and exclusion tags match, exclusion wins"
+    );
+}
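Editor's note: the skip/include decision in `filter_by_tags_` above reduces to two boolean checks. A std-only sketch of that logic, with an illustrative `keep_note` helper (not part of the crate's API):

```rust
// Distilled decision logic: a note is kept unless a skip tag matches, and,
// when an include list is given, only if at least one include tag matches.
// Exclusion therefore always wins over inclusion.
fn keep_note(tags: &[&str], skip: &[&str], only: &[&str]) -> bool {
    let skipped = skip.iter().any(|t| tags.contains(t));
    let included = only.is_empty() || only.iter().any(|t| tags.contains(t));
    !skipped && included
}

fn main() {
    assert!(keep_note(&["publish"], &[], &[])); // no filters: keep everything
    assert!(!keep_note(&["draft"], &["draft"], &[])); // skip tag matches
    assert!(!keep_note(&[], &[], &["publish"])); // include list set, note untagged
    assert!(!keep_note(&["skip"], &["skip"], &["skip"])); // exclusion wins
}
```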
@@ -6,7 +6,7 @@ lazy_static! {
         Regex::new(r"^(?P<file>[^#|]+)??(#(?P<section>.+?))??(\|(?P<label>.+?))??$").unwrap();
 }

-#[derive(Debug, Clone, PartialEq)]
+#[derive(Debug, Clone, PartialEq, Eq)]
 /// ObsidianNoteReference represents the structure of a `[[note]]` or `![[embed]]` reference.
 pub struct ObsidianNoteReference<'a> {
     /// The file (note name or partial path) being referenced.
@@ -18,7 +18,7 @@ pub struct ObsidianNoteReference<'a> {
     pub label: Option<&'a str>,
 }

-#[derive(PartialEq)]
+#[derive(PartialEq, Eq)]
 /// RefParserState enumerates all the possible parsing states [RefParser] may enter.
 pub enum RefParserState {
     NoState,
@@ -71,16 +71,16 @@ impl RefParser {
 impl<'a> ObsidianNoteReference<'a> {
     pub fn from_str(text: &str) -> ObsidianNoteReference {
         let captures = OBSIDIAN_NOTE_LINK_RE
-            .captures(&text)
+            .captures(text)
             .expect("note link regex didn't match - bad input?");
-        let file = captures.name("file").map(|v| v.as_str());
+        let file = captures.name("file").map(|v| v.as_str().trim());
         let label = captures.name("label").map(|v| v.as_str());
-        let section = captures.name("section").map(|v| v.as_str());
+        let section = captures.name("section").map(|v| v.as_str().trim());

         ObsidianNoteReference {
             file,
-            label,
             section,
+            label,
         }
     }
@@ -1,4 +1,4 @@
-use crate::{ExportError, WalkDirError};
+use crate::{ExportError, WalkDirSnafu};
 use ignore::{DirEntry, Walk, WalkBuilder};
 use snafu::ResultExt;
 use std::fmt;
@@ -88,9 +88,9 @@ pub fn vault_contents(path: &Path, opts: WalkOptions) -> Result<Vec<PathBuf>> {
     let mut contents = Vec::new();
     let walker = opts.build_walker(path);
     for entry in walker {
-        let entry = entry.context(WalkDirError { path })?;
+        let entry = entry.context(WalkDirSnafu { path })?;
         let path = entry.path();
-        let metadata = entry.metadata().context(WalkDirError { path })?;
+        let metadata = entry.metadata().context(WalkDirSnafu { path })?;

         if metadata.is_dir() {
             continue;
@@ -1,10 +1,6 @@
-use obsidian_export::{
-    Context, ExportError, Exporter, FrontmatterStrategy, MarkdownEvents, PostprocessorResult,
-};
+use obsidian_export::{ExportError, Exporter, FrontmatterStrategy};
 use pretty_assertions::assert_eq;
-use pulldown_cmark::{CowStr, Event};
-use serde_yaml::Value;
-use std::fs::{create_dir, read_to_string, remove_file, set_permissions, File, Permissions};
+use std::fs::{create_dir, read_to_string, set_permissions, File, Permissions};
 use std::io::prelude::*;
 use std::path::PathBuf;
 use tempfile::TempDir;
@@ -35,13 +31,14 @@ fn test_main_variants_with_default_options() {
             continue;
         };
         let filename = entry.file_name().to_string_lossy().into_owned();
-        let expected = read_to_string(entry.path()).expect(&format!(
-            "failed to read {} from testdata/expected/main-samples/",
-            entry.path().display()
-        ));
-        let actual = read_to_string(tmp_dir.path().clone().join(PathBuf::from(&filename))).expect(
-            &format!("failed to read {} from temporary exportdir", filename),
-        );
+        let expected = read_to_string(entry.path()).unwrap_or_else(|_| {
+            panic!(
+                "failed to read {} from testdata/expected/main-samples/",
+                entry.path().display()
+            )
+        });
+        let actual = read_to_string(tmp_dir.path().join(PathBuf::from(&filename)))
+            .unwrap_or_else(|_| panic!("failed to read {} from temporary exportdir", filename));

         assert_eq!(
             expected, actual,
@@ -65,7 +62,6 @@ fn test_frontmatter_never() {
     let actual = read_to_string(
         tmp_dir
             .path()
-            .clone()
             .join(PathBuf::from("note-with-frontmatter.md")),
     )
     .unwrap();
@@ -88,7 +84,6 @@ fn test_frontmatter_always() {
    let actual = read_to_string(
        tmp_dir
            .path()
-            .clone()
            .join(PathBuf::from("note-without-frontmatter.md")),
    )
    .unwrap();
@@ -99,7 +94,6 @@ fn test_frontmatter_always() {
     let actual = read_to_string(
         tmp_dir
             .path()
-            .clone()
             .join(PathBuf::from("note-with-frontmatter.md")),
     )
     .unwrap();
@@ -117,10 +111,7 @@ fn test_exclude() {
         .run()
         .expect("exporter returned error");

-    let excluded_note = tmp_dir
-        .path()
-        .clone()
-        .join(PathBuf::from("excluded-note.md"));
+    let excluded_note = tmp_dir.path().join(PathBuf::from("excluded-note.md"));
     assert!(
         !excluded_note.exists(),
         "excluded-note.md was found in tmpdir, but should be absent due to .export-ignore rules"
@@ -139,14 +130,14 @@ fn test_single_file_to_dir() {

     assert_eq!(
         read_to_string("tests/testdata/expected/single-file/note.md").unwrap(),
-        read_to_string(tmp_dir.path().clone().join(PathBuf::from("note.md"))).unwrap(),
+        read_to_string(tmp_dir.path().join(PathBuf::from("note.md"))).unwrap(),
     );
 }

 #[test]
 fn test_single_file_to_file() {
     let tmp_dir = TempDir::new().expect("failed to make tempdir");
-    let dest = tmp_dir.path().clone().join(PathBuf::from("export.md"));
+    let dest = tmp_dir.path().join(PathBuf::from("export.md"));

     Exporter::new(
         PathBuf::from("tests/testdata/input/single-file/note.md"),
@@ -161,6 +152,79 @@ fn test_single_file_to_file() {
     );
 }

+#[test]
+fn test_start_at_subdir() {
+    let tmp_dir = TempDir::new().expect("failed to make tempdir");
+    let mut exporter = Exporter::new(
+        PathBuf::from("tests/testdata/input/start-at/"),
+        tmp_dir.path().to_path_buf(),
+    );
+    exporter.start_at(PathBuf::from("tests/testdata/input/start-at/subdir"));
+    exporter.run().unwrap();
+
+    let expected = if cfg!(windows) {
+        read_to_string("tests/testdata/expected/start-at/subdir/Note B.md")
+            .unwrap()
+            .replace('/', "\\")
+    } else {
+        read_to_string("tests/testdata/expected/start-at/subdir/Note B.md").unwrap()
+    };
+
+    assert_eq!(
+        expected,
+        read_to_string(tmp_dir.path().join(PathBuf::from("Note B.md"))).unwrap(),
+    );
+}
+
+#[test]
+fn test_start_at_file_within_subdir_destination_is_dir() {
+    let tmp_dir = TempDir::new().expect("failed to make tempdir");
+    let mut exporter = Exporter::new(
+        PathBuf::from("tests/testdata/input/start-at/"),
+        tmp_dir.path().to_path_buf(),
+    );
+    exporter.start_at(PathBuf::from(
+        "tests/testdata/input/start-at/subdir/Note B.md",
+    ));
+    exporter.run().unwrap();
+
+    let expected = if cfg!(windows) {
+        read_to_string("tests/testdata/expected/start-at/single-file/Note B.md")
+            .unwrap()
+            .replace('/', "\\")
+    } else {
+        read_to_string("tests/testdata/expected/start-at/single-file/Note B.md").unwrap()
+    };
+
+    assert_eq!(
+        expected,
+        read_to_string(tmp_dir.path().join(PathBuf::from("Note B.md"))).unwrap(),
+    );
+}
+
+#[test]
+fn test_start_at_file_within_subdir_destination_is_file() {
+    let tmp_dir = TempDir::new().expect("failed to make tempdir");
+    let dest = tmp_dir.path().join(PathBuf::from("note.md"));
+    let mut exporter = Exporter::new(
+        PathBuf::from("tests/testdata/input/start-at/"),
+        dest.clone(),
+    );
+    exporter.start_at(PathBuf::from(
+        "tests/testdata/input/start-at/subdir/Note B.md",
+    ));
+    exporter.run().unwrap();
+
+    let expected = if cfg!(windows) {
+        read_to_string("tests/testdata/expected/start-at/single-file/Note B.md")
+            .unwrap()
+            .replace('/', "\\")
+    } else {
+        read_to_string("tests/testdata/expected/start-at/single-file/Note B.md").unwrap()
+    };
+    assert_eq!(expected, read_to_string(dest).unwrap(),);
+}
+
 #[test]
 fn test_not_existing_source() {
     let tmp_dir = TempDir::new().expect("failed to make tempdir");
@@ -290,7 +354,7 @@ fn test_no_recursive_embeds() {

     assert_eq!(
         read_to_string("tests/testdata/expected/infinite-recursion/Note A.md").unwrap(),
-        read_to_string(tmp_dir.path().clone().join(PathBuf::from("Note A.md"))).unwrap(),
+        read_to_string(tmp_dir.path().join(PathBuf::from("Note A.md"))).unwrap(),
     );
 }
@@ -316,13 +380,14 @@ fn test_non_ascii_filenames() {
             continue;
         };
         let filename = entry.file_name().to_string_lossy().into_owned();
-        let expected = read_to_string(entry.path()).expect(&format!(
-            "failed to read {} from testdata/expected/non-ascii/",
-            entry.path().display()
-        ));
-        let actual = read_to_string(tmp_dir.path().clone().join(PathBuf::from(&filename))).expect(
-            &format!("failed to read {} from temporary exportdir", filename),
-        );
+        let expected = read_to_string(entry.path()).unwrap_or_else(|_| {
+            panic!(
+                "failed to read {} from testdata/expected/non-ascii/",
+                entry.path().display()
+            )
+        });
+        let actual = read_to_string(tmp_dir.path().join(PathBuf::from(&filename)))
+            .unwrap_or_else(|_| panic!("failed to read {} from temporary exportdir", filename));

         assert_eq!(
             expected, actual,
@@ -345,117 +410,12 @@ fn test_same_filename_different_directories() {
     let expected = if cfg!(windows) {
         read_to_string("tests/testdata/expected/same-filename-different-directories/Note.md")
             .unwrap()
-            .replace("/", "\\")
+            .replace('/', "\\")
     } else {
         read_to_string("tests/testdata/expected/same-filename-different-directories/Note.md")
             .unwrap()
     };

-    let actual = read_to_string(tmp_dir.path().clone().join(PathBuf::from("Note.md"))).unwrap();
+    let actual = read_to_string(tmp_dir.path().join(PathBuf::from("Note.md"))).unwrap();
     assert_eq!(expected, actual);
 }
-
-/// This postprocessor replaces any instance of "foo" with "bar" in the note body.
-fn foo_to_bar(
-    ctx: Context,
-    events: MarkdownEvents,
-) -> (Context, MarkdownEvents, PostprocessorResult) {
-    let events = events
-        .into_iter()
-        .map(|event| match event {
-            Event::Text(text) => Event::Text(CowStr::from(text.replace("foo", "bar"))),
-            event => event,
-        })
-        .collect();
-    (ctx, events, PostprocessorResult::Continue)
-}
-
-/// This postprocessor appends "bar: baz" to frontmatter.
-fn append_frontmatter(
-    mut ctx: Context,
-    events: MarkdownEvents,
-) -> (Context, MarkdownEvents, PostprocessorResult) {
-    ctx.frontmatter.insert(
-        Value::String("bar".to_string()),
-        Value::String("baz".to_string()),
-    );
-    (ctx, events, PostprocessorResult::Continue)
-}
-
-// The purpose of this test is to verify that the `append_frontmatter` postprocessor is called to
-// extend the frontmatter, and the `foo_to_bar` postprocessor is called to replace instances of
-// "foo" with "bar" (only in the note body).
-#[test]
-fn test_postprocessors() {
-    let tmp_dir = TempDir::new().expect("failed to make tempdir");
-    let mut exporter = Exporter::new(
-        PathBuf::from("tests/testdata/input/postprocessors"),
-        tmp_dir.path().to_path_buf(),
-    );
-    exporter.add_postprocessor(&foo_to_bar);
-    exporter.add_postprocessor(&append_frontmatter);
-
-    exporter.run().unwrap();
-
-    let expected = read_to_string("tests/testdata/expected/postprocessors/Note.md").unwrap();
-    let actual = read_to_string(tmp_dir.path().clone().join(PathBuf::from("Note.md"))).unwrap();
-    assert_eq!(expected, actual);
-}
-
-#[test]
-fn test_postprocessor_stophere() {
-    let tmp_dir = TempDir::new().expect("failed to make tempdir");
-    let mut exporter = Exporter::new(
-        PathBuf::from("tests/testdata/input/postprocessors"),
-        tmp_dir.path().to_path_buf(),
-    );
-
-    exporter.add_postprocessor(&|ctx, mdevents| (ctx, mdevents, PostprocessorResult::StopHere));
-    exporter.add_postprocessor(&|_, _| panic!("should not be called due to above processor"));
-    exporter.run().unwrap();
-}
-
-#[test]
-fn test_postprocessor_stop_and_skip() {
-    let tmp_dir = TempDir::new().expect("failed to make tempdir");
-    let note_path = tmp_dir.path().clone().join(PathBuf::from("Note.md"));
-
-    let mut exporter = Exporter::new(
-        PathBuf::from("tests/testdata/input/postprocessors"),
-        tmp_dir.path().to_path_buf(),
-    );
-    exporter.run().unwrap();
-
-    assert!(note_path.exists());
-    remove_file(&note_path).unwrap();
-
-    exporter
-        .add_postprocessor(&|ctx, mdevents| (ctx, mdevents, PostprocessorResult::StopAndSkipNote));
-    exporter.run().unwrap();
-
-    assert!(!note_path.exists());
-}
-
-#[test]
-fn test_postprocessor_change_destination() {
-    let tmp_dir = TempDir::new().expect("failed to make tempdir");
-    let original_note_path = tmp_dir.path().clone().join(PathBuf::from("Note.md"));
-    let mut exporter = Exporter::new(
PathBuf::from("tests/testdata/input/postprocessors"),
|
||||
tmp_dir.path().to_path_buf(),
|
||||
);
|
||||
exporter.run().unwrap();
|
||||
|
||||
assert!(original_note_path.exists());
|
||||
remove_file(&original_note_path).unwrap();
|
||||
|
||||
exporter.add_postprocessor(&|mut ctx, mdevents| {
|
||||
ctx.destination.set_file_name("MovedNote.md");
|
||||
(ctx, mdevents, PostprocessorResult::Continue)
|
||||
});
|
||||
exporter.run().unwrap();
|
||||
|
||||
let new_note_path = tmp_dir.path().clone().join(PathBuf::from("MovedNote.md"));
|
||||
assert!(!original_note_path.exists());
|
||||
assert!(new_note_path.exists());
|
||||
}
|
||||
|
|
|
@@ -0,0 +1,292 @@
use obsidian_export::postprocessors::{filter_by_tags, softbreaks_to_hardbreaks};
use obsidian_export::{Context, Exporter, MarkdownEvents, PostprocessorResult};
use pretty_assertions::assert_eq;
use pulldown_cmark::{CowStr, Event};
use serde_yaml::Value;
use std::collections::HashSet;
use std::fs::{read_to_string, remove_file};
use std::path::PathBuf;
use std::sync::Mutex;
use tempfile::TempDir;
use walkdir::WalkDir;

/// This postprocessor replaces any instance of "foo" with "bar" in the note body.
fn foo_to_bar(_ctx: &mut Context, events: &mut MarkdownEvents) -> PostprocessorResult {
    for event in events.iter_mut() {
        if let Event::Text(text) = event {
            *event = Event::Text(CowStr::from(text.replace("foo", "bar")))
        }
    }
    PostprocessorResult::Continue
}

/// This postprocessor appends "bar: baz" to frontmatter.
fn append_frontmatter(ctx: &mut Context, _events: &mut MarkdownEvents) -> PostprocessorResult {
    ctx.frontmatter.insert(
        Value::String("bar".to_string()),
        Value::String("baz".to_string()),
    );
    PostprocessorResult::Continue
}

// The purpose of this test is to verify the `append_frontmatter` postprocessor is called to extend
// the frontmatter, and the `foo_to_bar` postprocessor is called to replace instances of "foo" with
// "bar" (only in the note body).
#[test]
fn test_postprocessors() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );
    exporter.add_postprocessor(&foo_to_bar);
    exporter.add_postprocessor(&append_frontmatter);

    exporter.run().unwrap();

    let expected = read_to_string("tests/testdata/expected/postprocessors/Note.md").unwrap();
    let actual = read_to_string(tmp_dir.path().join(PathBuf::from("Note.md"))).unwrap();
    assert_eq!(expected, actual);
}

#[test]
fn test_postprocessor_stophere() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );

    exporter.add_postprocessor(&|_ctx, _mdevents| PostprocessorResult::StopHere);
    exporter.add_embed_postprocessor(&|_ctx, _mdevents| PostprocessorResult::StopHere);
    exporter.add_postprocessor(&|_, _| panic!("should not be called due to above processor"));
    exporter.add_embed_postprocessor(&|_, _| panic!("should not be called due to above processor"));
    exporter.run().unwrap();
}

#[test]
fn test_postprocessor_stop_and_skip() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let note_path = tmp_dir.path().join(PathBuf::from("Note.md"));

    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );
    exporter.run().unwrap();

    assert!(note_path.exists());
    remove_file(&note_path).unwrap();

    exporter.add_postprocessor(&|_ctx, _mdevents| PostprocessorResult::StopAndSkipNote);
    exporter.run().unwrap();

    assert!(!note_path.exists());
}

#[test]
fn test_postprocessor_change_destination() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let original_note_path = tmp_dir.path().join(PathBuf::from("Note.md"));
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );
    exporter.run().unwrap();

    assert!(original_note_path.exists());
    remove_file(&original_note_path).unwrap();

    exporter.add_postprocessor(&|ctx, _mdevents| {
        ctx.destination.set_file_name("MovedNote.md");
        PostprocessorResult::Continue
    });
    exporter.run().unwrap();

    let new_note_path = tmp_dir.path().join(PathBuf::from("MovedNote.md"));
    assert!(!original_note_path.exists());
    assert!(new_note_path.exists());
}

// Ensure the postprocessor type definition has proper lifetimes to allow state (here: `parents`)
// to be passed in. Otherwise, this fails with an error like:
//   error[E0597]: `parents` does not live long enough
//   cast requires that `parents` is borrowed for `'static`
#[test]
fn test_postprocessor_stateful_callback() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );

    let parents: Mutex<HashSet<PathBuf>> = Default::default();
    let callback = |ctx: &mut Context, _mdevents: &mut MarkdownEvents| -> PostprocessorResult {
        parents
            .lock()
            .unwrap()
            .insert(ctx.destination.parent().unwrap().to_path_buf());
        PostprocessorResult::Continue
    };
    exporter.add_postprocessor(&callback);

    exporter.run().unwrap();

    let expected = tmp_dir.path();

    let parents = parents.lock().unwrap();
    println!("{:?}", parents);
    assert_eq!(1, parents.len());
    assert!(parents.contains(expected));
}

// The purpose of this test is to verify that postprocessors registered with
// `add_embed_postprocessor` only apply to embedded notes: `foo_to_bar` should rewrite the
// embedded note body, while `append_frontmatter` should have no effect, since embeds don't
// carry their own frontmatter into the output.
#[test]
fn test_embed_postprocessors() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );
    exporter.add_embed_postprocessor(&foo_to_bar);
    // Should have no effect with embeds:
    exporter.add_embed_postprocessor(&append_frontmatter);

    exporter.run().unwrap();

    let expected =
        read_to_string("tests/testdata/expected/postprocessors/Note_embed_postprocess_only.md")
            .unwrap();
    let actual = read_to_string(tmp_dir.path().join(PathBuf::from("Note.md"))).unwrap();
    assert_eq!(expected, actual);
}

// When StopAndSkipNote is used with an embed postprocessor, it should skip the embedded note but
// continue with the rest of the note.
#[test]
fn test_embed_postprocessors_stop_and_skip() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );
    exporter.add_embed_postprocessor(&|_ctx, _mdevents| PostprocessorResult::StopAndSkipNote);

    exporter.run().unwrap();

    let expected =
        read_to_string("tests/testdata/expected/postprocessors/Note_embed_stop_and_skip.md")
            .unwrap();
    let actual = read_to_string(tmp_dir.path().join(PathBuf::from("Note.md"))).unwrap();
    assert_eq!(expected, actual);
}

// This test verifies that the context which is passed to an embed postprocessor is actually
// correct. Primarily, this means the frontmatter should reflect that of the note being embedded as
// opposed to the frontmatter of the root note.
#[test]
fn test_embed_postprocessors_context() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );

    exporter.add_postprocessor(&|ctx, _mdevents| {
        if ctx.current_file() != &PathBuf::from("Note.md") {
            return PostprocessorResult::Continue;
        }
        let is_root_note = ctx
            .frontmatter
            .get(&Value::String("is_root_note".to_string()))
            .unwrap();
        if is_root_note != &Value::Bool(true) {
            // NOTE: Test failure may not give output consistently because the test binary affects
            // how output is captured and printed in the thread running this postprocessor. Just
            // run the test a couple of times until the error shows up.
            panic!(
                "postprocessor: expected is_root_note in {} to be true, got false",
                &ctx.current_file().display()
            )
        }
        PostprocessorResult::Continue
    });
    exporter.add_embed_postprocessor(&|ctx, _mdevents| {
        let is_root_note = ctx
            .frontmatter
            .get(&Value::String("is_root_note".to_string()))
            .unwrap();
        if is_root_note == &Value::Bool(true) {
            // NOTE: Test failure may not give output consistently because the test binary affects
            // how output is captured and printed in the thread running this postprocessor. Just
            // run the test a couple of times until the error shows up.
            panic!(
                "embed_postprocessor: expected is_root_note in {} to be false, got true",
                &ctx.current_file().display()
            )
        }
        PostprocessorResult::Continue
    });

    exporter.run().unwrap();
}

#[test]
fn test_softbreaks_to_hardbreaks() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/postprocessors"),
        tmp_dir.path().to_path_buf(),
    );
    exporter.add_postprocessor(&softbreaks_to_hardbreaks);
    exporter.run().unwrap();

    let expected =
        read_to_string("tests/testdata/expected/postprocessors/hard_linebreaks.md").unwrap();
    let actual = read_to_string(tmp_dir.path().join(PathBuf::from("hard_linebreaks.md"))).unwrap();
    assert_eq!(expected, actual);
}

#[test]
fn test_filter_by_tags() {
    let tmp_dir = TempDir::new().expect("failed to make tempdir");
    let mut exporter = Exporter::new(
        PathBuf::from("tests/testdata/input/filter-by-tags"),
        tmp_dir.path().to_path_buf(),
    );
    let filter_by_tags = filter_by_tags(
        vec!["private".to_string(), "no-export".to_string()],
        vec!["export".to_string()],
    );
    exporter.add_postprocessor(&filter_by_tags);
    exporter.run().unwrap();

    let walker = WalkDir::new("tests/testdata/expected/filter-by-tags/")
        // Without sorting here, different test runs may trigger the first assertion failure in
        // unpredictable order.
        .sort_by(|a, b| a.file_name().cmp(b.file_name()))
        .into_iter();
    for entry in walker {
        let entry = entry.unwrap();
        if entry.metadata().unwrap().is_dir() {
            continue;
        };
        let filename = entry.file_name().to_string_lossy().into_owned();
        let expected = read_to_string(entry.path()).unwrap_or_else(|_| {
            panic!(
                "failed to read {} from testdata/expected/filter-by-tags",
                entry.path().display()
            )
        });
        let actual = read_to_string(tmp_dir.path().join(PathBuf::from(&filename)))
            .unwrap_or_else(|_| panic!("failed to read {} from temporary exportdir", filename));

        assert_eq!(
            expected, actual,
            "{} does not have expected content",
            filename
        );
    }
}
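The diff above changes the postprocessor signature from an owned-value style (take `Context` and `MarkdownEvents` by value, return them in a tuple) to a mutate-in-place style (`&mut` references, returning only a `PostprocessorResult`). A minimal self-contained sketch of the new shape, using stand-in types rather than the crate's actual `Context` and `Event` definitions:

```rust
use std::collections::BTreeMap;

// Stand-ins for the crate's types; the real `Event` comes from pulldown_cmark
// and the real `Context` carries source/destination paths and frontmatter.
#[derive(Debug, PartialEq)]
enum Event {
    Text(String),
}

#[derive(Debug, PartialEq)]
#[allow(dead_code)]
enum PostprocessorResult {
    Continue,
    StopHere,
    StopAndSkipNote,
}

type MarkdownEvents = Vec<Event>;

#[allow(dead_code)]
struct Context {
    frontmatter: BTreeMap<String, String>,
}

// New-style postprocessor: mutate the events in place and return only a
// result, instead of threading owned values through a tuple.
fn foo_to_bar(_ctx: &mut Context, events: &mut MarkdownEvents) -> PostprocessorResult {
    for event in events.iter_mut() {
        let Event::Text(text) = event;
        let replaced = text.replace("foo", "bar");
        *event = Event::Text(replaced);
    }
    PostprocessorResult::Continue
}

fn main() {
    let mut ctx = Context {
        frontmatter: BTreeMap::new(),
    };
    let mut events: MarkdownEvents = vec![Event::Text("foo and foo".to_string())];
    let result = foo_to_bar(&mut ctx, &mut events);
    assert_eq!(result, PostprocessorResult::Continue);
    assert_eq!(events, vec![Event::Text("bar and bar".to_string())]);
}
```

Besides avoiding the move/return boilerplate, this shape is what lets the stateful-callback test above borrow `parents` from the enclosing scope instead of requiring `'static` captures.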
@@ -0,0 +1,7 @@
---
tags:
- export
- me
---

A public note
@@ -0,0 +1,6 @@
---
tags:
- export
---

A public note
@@ -11,6 +11,8 @@ Image embed:

![white.png](white.png)

![bulb.svg](bulb.svg)

PDF embed:

[note.pdf](note.pdf)
@@ -22,7 +24,7 @@ And within a code block:
![[note-with-frontmatter]]
````

![\[Not a valid embed.]
![\[Not a valid embed.\]

![\[Partial embed
@@ -0,0 +1,17 @@
Obsidian trims space before and after the filename in a wikilink target.
These should all be the same:

[foo](foo.md)
[foo](foo.md)
[foo](foo.md)
[foo](foo.md)

[foo](foo.md)
[foo](foo.md)
[foo](foo.md)
[foo](foo.md)

[foo > ^abc](foo.md#abc)
[foo > ^abc](foo.md#abc)
[foo > ^abc](foo.md#abc)
[foo > ^abc](foo.md#abc)
@@ -1,8 +1,11 @@
---
foo: bar
is_root_note: true
bar: baz
---

# Title

This note is embedded. It mentions the word bar.

Sentence containing bar.
@@ -0,0 +1,10 @@
---
foo: bar
is_root_note: true
---

# Title

This note is embedded. It mentions the word bar.

Sentence containing foo.
@@ -0,0 +1,10 @@
---
foo: bar
is_root_note: true
---

# Title



Sentence containing foo.
@@ -0,0 +1,18 @@
# Heading 1

Here's a random quote from fortune(6):

"I don't have to take this abuse from you -- I've got hundreds of
people waiting to abuse me."
-- Bill Murray, "Ghostbusters"

## Heading 2

Here's another random quote from fortune(6):

````
Cinemuck, n.:
The combination of popcorn, soda, and melted chocolate which
covers the floors of movie theaters.
-- Rich Hall, "Sniglets"
````
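The fixture above is the expected output of the `softbreaks_to_hardbreaks` postprocessor exercised earlier in the diff. A stand-in sketch of that pass (hypothetical types, not the crate's actual pulldown_cmark-based implementation): it rewrites every soft line break in the event stream to a hard break, leaving all other events alone.

```rust
// Stand-in event type; the real one is pulldown_cmark::Event, which has
// SoftBreak and HardBreak variants with the same meaning.
#[derive(Debug, PartialEq)]
enum Event {
    Text(String),
    SoftBreak,
    HardBreak,
}

// Sketch of a softbreaks-to-hardbreaks pass: replace each SoftBreak event
// with a HardBreak in place.
fn softbreaks_to_hardbreaks(events: &mut Vec<Event>) {
    for event in events.iter_mut() {
        if *event == Event::SoftBreak {
            *event = Event::HardBreak;
        }
    }
}

fn main() {
    let mut events = vec![
        Event::Text("line one".to_string()),
        Event::SoftBreak,
        Event::Text("line two".to_string()),
    ];
    softbreaks_to_hardbreaks(&mut events);
    assert_eq!(events[1], Event::HardBreak);
}
```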
@@ -0,0 +1,4 @@
This is note B. It links to:

* [Note A](../Note%20A.md)
* [Note C](Note%20C.md)
@@ -0,0 +1,4 @@
This is note B. It links to:

* [Note A](../Note%20A.md)
* [Note C](Note%20C.md)
@@ -0,0 +1 @@
This is note C.
@@ -0,0 +1,5 @@
---
tags: [export, me]
---

A public note
@@ -0,0 +1,5 @@
---
tags: [export, no-export, private]
---

A private note
@@ -0,0 +1,5 @@
---
tags: [export]
---

A public note
@@ -0,0 +1 @@
A note without frontmatter should be exported.
@@ -0,0 +1,5 @@
---
tags: [no, no-export]
---

A private note
@@ -0,0 +1,5 @@
---
title: foo
---

A public note.
@@ -0,0 +1,5 @@
---
tags: [private]
---

A private note.
@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="1728" height="1728" viewBox="0 0 1728 1728"><path d="M1065.6 1468.8c0 21.197-17.203 38.4-38.4 38.4H720c-21.197 0-38.4-17.203-38.4-38.4 0-21.196 17.203-38.4 38.4-38.4h307.2c21.196 0 38.4 17.204 38.4 38.4zm-38.4 57.6H720c-25.268 0-44.832 24.403-36.422 50.918 5.068 15.994 21.254 25.882 38.035 25.882h.575c22.177 0 42.45 12.537 52.378 32.37l.403.808c13.38 26.727 40.703 43.622 70.598 43.622h56.063c29.896 0 57.217-16.896 70.58-43.622l.403-.807c9.927-19.833 30.202-32.37 52.378-32.37h.577c16.78 0 32.966-9.888 38.035-25.882 8.43-26.514-11.135-50.918-36.402-50.918zM873.6 297.6c21.197 0 38.4-17.203 38.4-38.4V105.6c0-21.197-17.203-38.4-38.4-38.4-21.196 0-38.4 17.203-38.4 38.4v153.6c0 21.197 17.203 38.4 38.4 38.4zM466.31 443.808c7.49 7.508 17.32 11.252 27.15 11.252s19.66-3.744 27.147-11.252c14.996-14.995 14.996-39.302 0-54.297L411.993 280.897c-14.976-14.995-39.32-14.995-54.297 0-14.995 14.995-14.995 39.302 0 54.297L466.31 443.808zM374.4 796.8c0-21.196-17.203-38.4-38.4-38.4H182.4c-21.197 0-38.4 17.204-38.4 38.4 0 21.197 17.203 38.4 38.4 38.4H336c21.197 0 38.4-17.203 38.4-38.4zm91.91 352.992l-108.613 108.614c-14.995 14.995-14.995 39.303 0 54.298 7.487 7.507 17.318 11.25 27.148 11.25s19.66-3.743 27.148-11.25l108.614-108.614c14.996-14.995 14.996-39.303 0-54.298-14.975-14.995-39.32-14.995-54.296 0zm814.58 0c-14.995-14.995-39.303-14.995-54.298 0s-14.995 39.303 0 54.298l108.614 108.614c7.508 7.507 17.32 11.25 27.15 11.25s19.64-3.743 27.147-11.25c14.995-14.995 14.995-39.303 0-54.298l-108.613-108.614zM1564.8 758.4h-153.6c-21.197 0-38.4 17.203-38.4 38.4 0 21.196 17.203 38.4 38.4 38.4h153.6c21.197 0 38.4-17.204 38.4-38.4 0-21.196-17.203-38.4-38.4-38.4zm-311.06-303.34c9.83 0 19.643-3.744 27.15-11.252l108.613-108.614c14.995-14.995 14.995-39.302 0-54.297s-39.303-14.995-54.298 0L1226.592 389.51c-14.995 14.996-14.995 39.302 0 54.297 7.488 7.508 17.318 11.253 27.15 11.253zM1065.6 1372.8c0 21.197-17.203 38.4-38.4 38.4H720c-21.197 
0-38.4-17.203-38.4-38.4 0-20.448 16.032-37.018 36.192-38.17C693.888 1118.555 470.4 1070.42 470.4 816c0-222.682 180.518-403.2 403.2-403.2s403.2 180.52 403.2 403.2c0 254.42-223.488 302.554-247.393 518.63 20.16 1.152 36.193 17.722 36.193 38.17zM785.107 520.512c-5.972-14.727-22.733-21.81-37.518-15.878C649.46 544.396 575.06 629.396 548.58 732c-3.975 15.418 5.3 31.104 20.698 35.078 2.4.634 4.82.922 7.2.922 12.825 0 24.518-8.62 27.878-21.6 21.927-85.018 83.56-155.443 164.852-188.37 14.745-5.972 21.85-22.754 15.897-37.518z"/></svg>
After Width: | Height: | Size: 2.5 KiB
@@ -10,6 +10,8 @@ Image embed:

![[white.png]]

![[bulb.svg]]

PDF embed:

![[note.pdf]]
@@ -0,0 +1,17 @@
Obsidian trims space before and after the filename in a wikilink target.
These should all be the same:

[[foo]]
[[ foo]]
[[foo ]]
[[ foo ]]

[[foo|foo]]
[[ foo|foo]]
[[foo |foo]]
[[ foo |foo]]

[[foo#^abc]]
[[foo#^abc ]]
[[foo#^abc |foo > ^abc]]
[[ foo#^abc ]]
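The fixture above exercises whitespace trimming around wikilink targets. A hypothetical helper (not obsidian-export's actual parser) showing the trimming rule those cases imply:

```rust
// Hypothetical helper: given the text inside a [[...]] wikilink, extract the
// file target and optional #anchor, trimming the surrounding whitespace that
// Obsidian ignores. An optional "|label" part is dropped first.
fn wikilink_target(inner: &str) -> (&str, Option<&str>) {
    let target = inner.split('|').next().unwrap_or(inner);
    match target.split_once('#') {
        Some((file, anchor)) => (file.trim(), Some(anchor.trim())),
        None => (target.trim(), None),
    }
}

fn main() {
    // The variants from the fixture all resolve to the same target.
    assert_eq!(wikilink_target("foo"), ("foo", None));
    assert_eq!(wikilink_target(" foo "), ("foo", None));
    assert_eq!(wikilink_target(" foo |foo"), ("foo", None));
    assert_eq!(wikilink_target("foo#^abc "), ("foo", Some("^abc")));
}
```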
@@ -4,4 +4,4 @@

## This is a header

This is a block ^dda637
This is a block ^dda637
@@ -1,7 +1,10 @@
---
foo: bar
is_root_note: true
---

# Title

![[_embed]]

Sentence containing foo.
@@ -0,0 +1,5 @@
---
is_root_note: false
---

This note is embedded. It mentions the word foo.
@@ -0,0 +1,18 @@
# Heading 1

Here's a random quote from fortune(6):

"I don't have to take this abuse from you -- I've got hundreds of
people waiting to abuse me."
-- Bill Murray, "Ghostbusters"

## Heading 2

Here's another random quote from fortune(6):

```
Cinemuck, n.:
The combination of popcorn, soda, and melted chocolate which
covers the floors of movie theaters.
-- Rich Hall, "Sniglets"
```
@@ -1 +1 @@
Note in dir2.
Note in dir2.
@@ -0,0 +1 @@
This is note A.
@@ -0,0 +1,4 @@
This is note B. It links to:

- [[Note A]]
- [[Note C]]
@@ -0,0 +1 @@
This is note C.