as documented!
Closes #202.
|
Improve readme a bit
|
Note, however, that this may not be needed at all:
the old code would have gone into an infinite loop
if the delimiter stack were not already freed.
If we can prove that the delimiter stack is empty
at this point, we could remove this; on the other hand,
it may not hurt to keep it here defensively.
Closes #189.
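As a minimal, hypothetical illustration of the defensive cleanup discussed above (not the actual cmark code; the struct keeps only the link needed to walk the stack):

    #include <stdlib.h>

    /* Hypothetical stand-in for a delimiter-stack node. */
    typedef struct delimiter {
      struct delimiter *previous;
      /* ... node pointer, delimiter char, flags omitted ... */
    } delimiter;

    /* Defensively free anything still on the stack, so a non-empty stack can
       neither leak nor send later code into an infinite loop, even if
       emphasis processing should already have emptied it. */
    static void free_remaining_delimiters(delimiter **stack_top) {
      delimiter *d = *stack_top;
      while (d != NULL) {
        delimiter *prev = d->previous;
        free(d);
        d = prev;
      }
      *stack_top = NULL;
    }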
|
Closes #188.
@nwellnhof - could you have a look and let me know if you
think this is a bad idea or could be improved?
|
needed for multilib distros like Fedora
|
The overflow could occur under the following condition:
the buffer ends with `\r` and the next memory address
contains `\n`.
Closes #184.
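A standalone sketch of the kind of bounds check this requires (illustrative code, not the cmark source): before peeking at the byte after a `\r`, confirm it is still inside the buffer.

    #include <stddef.h>

    /* Advance past a line ending at buf[pos]. The `pos + 1 < len` guard is
       the point: without it, a buffer whose last byte is '\r' would read one
       byte past its end looking for the '\n'. */
    static size_t skip_line_ending(const unsigned char *buf, size_t pos, size_t len) {
      if (buf[pos] == '\r') {
        if (pos + 1 < len && buf[pos + 1] == '\n')
          return pos + 2; /* CRLF */
        return pos + 1;   /* lone CR */
      }
      if (buf[pos] == '\n')
        return pos + 1;   /* lone LF */
      return pos;         /* not at a line ending */
    }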
|
Strong now goes inside Emph rather than the reverse,
when both scopes are possible.
The code is much simpler.
This also avoids a spec inconsistency that cmark had previously:
`***hi***` became Strong (Emph "hi") but
`***hi****` became Emph (Strong "hi") "*"
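The new nesting can be checked from the public API; a minimal sketch using `cmark_markdown_to_html` (the expected output in the comment is the CommonMark spec rendering):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include "cmark.h"

    int main(void) {
      const char *input = "***hi***";
      char *html = cmark_markdown_to_html(input, strlen(input), CMARK_OPT_DEFAULT);
      /* Strong now nests inside Emph:
         expected: <p><em><strong>hi</strong></em></p> */
      printf("%s", html);
      free(html);
      return 0;
    }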
|
* Don't double-output the link in latex-rendering.
* Prevent ligatures in dashes sensibly when rendering latex.
\- is a discretionary hyphen, so it doesn't get displayed at all.
* Redo "Don't double-output the link in latex-rendering."
This reverts commit 8fb1f1c3c8799628141780ca5fd8d70883c1ec53
and adds the proper solution to the problem.
With commit 8fb1f1c3c double rendering is fixed, but the url isn't
escaped anymore, so I discarded the wrong copy.
We now return 0 from the function in case of a single link,
which stops processing the contents of the node.
* Add a comment about the double-rendering issue addressed in 1c0d4749451cf85a849a3cf8e41cf137789821d4
|
CMake improvement
|
Now you can enable/disable compilation and installation targets for
shared and static libraries via -DCMARK_SHARED=ON/OFF and
-DCMARK_STATIC=ON/OFF
|
Replaced the ${LIB_INSTALL_DIR} option with the built-in ${LIB_SUFFIX} for
installing on 32/64-bit systems. Normally, CMake will set ${LIB_SUFFIX}
automatically for the target environment.
If you have any issues with it, you can override this option with
-DLIB_SUFFIX=64 or -DLIB_SUFFIX="" during configuration.
|
Noticed the need for this through fuzzing.
|
We now use a much smaller array.
|
The new "multiple of 3" rule defeats one of our optimizations.
|
This reverts commit 9e643720ec903f3b448bd2589a0c02c2514805ae.
|
This reverts commit c4c1d59ca29aceb1c5919908ac97d9476264fd96.
|
This reverts commit 26182bb868d3da7dd8a3389729bea79d489855b7.
|
This reverts commit 4fbe344df43ed7f60a3d3a53981088334cb709fc.
|
We need to store the length of the original delimiter run,
instead of using the length of the remaining delimiters
after some have been subtracted.
Test case:
a***b* c*
Thanks to Raph Levien for reporting.
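A purely illustrative sketch of the distinction (field names are not cmark's):

    /* The "multiple of 3" check must look at orig_length, the size of the
       delimiter run as it was first scanned, not at remaining, which shrinks
       as emphasis gets matched. */
    typedef struct {
      int orig_length; /* length of the original delimiter run */
      int remaining;   /* delimiters not yet consumed by matches */
      char delim_char; /* '*' or '_' */
    } delim_run_sketch;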
|
* Improve strbuf guarantees
Introduce BUFSIZE_MAX macro and make sure that the strbuf implementation
can handle strings up to this size.
* Abort early if document size exceeds internal limit
* Change types for source map offsets
Switch to size_t for the public API, making the public headers
C89-compatible again.
Switch to bufsize_t internally, reducing memory usage and improving
performance on 32-bit platforms.
* Make parser return NULL on internal index overflow
Make S_parser_feed set an error and ignore subsequent chunks if the
total input document size exceeds an internal limit. Make
cmark_parser_finish return NULL if an error was encountered. Add
public API functions to retrieve error code and error message.
strbuf overflow in renderers and OOM in parser or renderers still
cause an abort.
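A minimal sketch of how a caller would observe this, assuming the behaviour described above (NULL from `cmark_parser_finish` after an error); the chunk-feeding and rendering calls are the existing public API:

    #include <stdio.h>
    #include <stdlib.h>
    #include "cmark.h"

    /* Treat a NULL document as an error, e.g. the internal size limit
       described above being exceeded. */
    static int render_or_fail(const char *buf, size_t len) {
      cmark_parser *parser = cmark_parser_new(CMARK_OPT_DEFAULT);
      cmark_parser_feed(parser, buf, len);
      cmark_node *doc = cmark_parser_finish(parser);
      cmark_parser_free(parser);
      if (doc == NULL)
        return -1; /* parser reported an error instead of aborting */
      char *html = cmark_render_html(doc, CMARK_OPT_DEFAULT);
      printf("%s", html);
      free(html);
      cmark_node_free(doc);
      return 0;
    }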
|
* open_new_blocks: always create child before advancing offset
* Source map
* Extent's typology
* In-depth python bindings
|
The `alloc` member wasn't initialized.
This also allows us to add an assertion in `chunk_rtrim`, which doesn't
work for alloced chunks.
|
1. Downloaded CaseFolding.txt from http://unicode.org/Public/UCD/latest/ucd/CaseFolding.txt
2. Deleted src/case_fold_switch.inc
3. Ran `make src/case_fold_switch.inc`
|
Currently aborts.
|
- Removed recursion in scan_to_closing_backticks
- Added an array of pointers to potential backtick closers
to subject
- This array is used to avoid traversing the subject again
when we've already seen all the potential backtick closers.
- Added a max bound of 1000 for backtick code span delimiters.
- This helps with pathological cases like:
x
x `
x ``
x ```
x ````
...
Thanks to Martin Mitáš for identifying the problem and for
discussion of solutions.
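A hypothetical sketch of the bookkeeping described in the list above (names and layout illustrative, not the actual cmark subject):

    #include <stddef.h>

    #define MAXBACKTICKS 1000 /* longer runs are never treated as code-span delimiters */

    /* Illustrative inline-parsing state: one slot per backtick-run length,
       remembering where a potential closer of that length was last seen so
       the input never has to be rescanned for it. */
    typedef struct {
      size_t pos;                         /* current position in the input */
      size_t backticks[MAXBACKTICKS + 1]; /* last candidate closer per run length (0 = none) */
      int scanned_for_backticks;          /* nonzero once the table has been filled */
    } subject_sketch;

Capping the run length keeps the table small and turns the pathological inputs above into a single scan instead of repeated traversals.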
|
Closes #163, thanks to @kainjow.