Markdown UI
For a few years now, the idea of a Second Brain has resonated with me. Being able to store an unlimited amount of information that interests me, and to sort it, categorize it, and create relations between items, lets me grow my knowledge of many topics and improve day by day.
After trying and evaluating many solutions out there, like Workflowy, Notion and now Obsidian, it is clear to me that the best way to store this kind of information is plain Markdown files. Their simplicity and the ease of backing them up anywhere are the main reasons to go this way.
But coming from Notion, I miss some nice features like kanban views, calendars, or tables: things that don't come out of the box in the Markdown spec.
One cool approach to storing complex information in Markdown is the fenced code block, prefixed by ```. You have seen this popular syntax on many source control platforms, where it provides syntax highlighting for different programming languages.
Goal
In this post, I want to explore and experiment with the possibilities of using this approach to render richer components from plain text, like the one required to model a simple interactive table.
The goal is that, after interacting with the UI, we can store the result back as plain text in the Markdown file and close the cycle. This idea is not new; it is already implemented in places like GitHub todo lists, kanban boards, or complex UML diagrams. To make this small project shine, I also wanted to play with some technology new to me, like WASM.
To understand the project vision, you can check the conceptual lifecycle of the information in the following diagram: plain Markdown is parsed into structured data, rendered as an interactive UI, and serialized back to plain Markdown.
Also, I wanted to refresh my knowledge about lexers and parsers, taking the opportunity to implement them in Rust.
First step: Parsing
To validate the idea and reach a minimum viable product, we will take the easiest example to play with: an interactive Markdown table in which all values are floats. This is an example of the content:
```table
2.3,6,9
12,3.44,5
9,2,2
```
First of all, we need a grammar, similar to the CSV structure, and we are going to define it as a Parsing Expression Grammar (PEG). This grammar could be implemented in JavaScript with a library like pegjs, but since we are implementing this step in Rust, we are going to use the pest crate.
The grammar needs to support a float in each one of the cells, called in this case a Field. A list of Fields separated by commas forms a row of the table, called a Record. Finally, a File is a list of Records separated by line breaks:
```
Field = { (ASCII_DIGIT | "." | "-")+ }
Record = { Field ~ ("," ~ Field)* }
File = { SOI ~ (Record ~ ("\r\n" | "\n"))* ~ EOI }
```
Note that this grammar doesn't include the classic Markdown code fence (```), because that responsibility belongs to the parent Markdown parser.
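To illustrate that split of responsibilities, here is a rough sketch of how a host could isolate the body of a ```table fence before handing it to our grammar. It uses the pulldown-cmark crate purely as an example; it is not part of the demo:

```rust
use pulldown_cmark::{CodeBlockKind, Event, Parser as MarkdownParser, Tag};

// Collect the raw bodies of every ```table fenced block in a Markdown document.
// The fences themselves never reach our PEG grammar; only the inner text does.
fn table_blocks(markdown: &str) -> Vec<String> {
    let mut blocks = Vec::new();
    let mut capturing = false;
    for event in MarkdownParser::new(markdown) {
        match event {
            Event::Start(Tag::CodeBlock(CodeBlockKind::Fenced(lang))) => {
                capturing = &*lang == "table";
                if capturing {
                    blocks.push(String::new());
                }
            }
            Event::Text(text) if capturing => blocks.last_mut().unwrap().push_str(&text),
            Event::End(_) => capturing = false,
            _ => {}
        }
    }
    blocks
}
```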
Once we have a nice grammar definition, we can use it to build the parser. Pest generates all the utility code required to parse the incoming data into abstract syntax tree nodes.
First, it generates the required enum to be able to differentiate the parsed tokens:
```rust
pub enum Rule {
    EOI,
    Field,
    Record,
    File,
}
```
It also generates a parser that builds the correct tree of nodes:
```rust
extern crate pest;
#[macro_use]
extern crate pest_derive;

use std::fs;
use pest::Parser;

#[derive(Parser)]
#[grammar = "csv.pest"]
pub struct CSVParser;

fn main() {
    // A sample of how to parse a cell / field from our grammar:
    let successful_parse = CSVParser::parse(Rule::Field, "-273.15");
    println!("{:?}", successful_parse);

    let unparsed_file = fs::read_to_string("./samples/test.csv").expect("cannot read file");
    let file = CSVParser::parse(Rule::File, &unparsed_file)
        .expect("unsuccessful parse") // unwrap the parse result
        .next().unwrap(); // get and unwrap the `file` rule; never fails

    // Print the parsed abstract syntax tree
    println!("AST: {:?}", file);
}
```
If we run this code with test.csv as input, we are going to get:
```
Pair {
    rule: File,
    span: Span { str: "2.3,6,9\n12,3.44,5\n9,2,2\n", start: 0, end: 24 },
    inner: [
        Pair {
            rule: Record,
            span: Span { str: "2.3,6,9", start: 0, end: 7 },
            inner: [
                Pair { rule: Field, span: Span { str: "2.3", start: 0, end: 3 }, inner: [] },
                Pair { rule: Field, span: Span { str: "6", start: 4, end: 5 }, inner: [] },
                Pair { rule: Field, span: Span { str: "9", start: 6, end: 7 }, inner: [] }
            ]
        },
        Pair {
            rule: Record,
            span: Span { str: "12,3.44,5", start: 8, end: 17 },
            inner: [
                Pair { rule: Field, span: Span { str: "12", start: 8, end: 10 }, inner: [] },
                Pair { rule: Field, span: Span { str: "3.44", start: 11, end: 15 }, inner: [] },
                Pair { rule: Field, span: Span { str: "5", start: 16, end: 17 }, inner: [] }
            ]
        },
        Pair {
            rule: Record,
            span: Span { str: "9,2,2", start: 18, end: 23 },
            inner: [
                Pair { rule: Field, span: Span { str: "9", start: 18, end: 19 }, inner: [] },
                Pair { rule: Field, span: Span { str: "2", start: 20, end: 21 }, inner: [] },
                Pair { rule: Field, span: Span { str: "2", start: 22, end: 23 }, inner: [] }
            ]
        },
        Pair { rule: EOI, span: Span { str: "", start: 24, end: 24 }, inner: [] }
    ]
}
```
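Before moving on, a small unit test helps while iterating on the grammar. This is only a sketch, not part of the original project, and it assumes the CSVParser and Rule items shown above are in scope:

```rust
#[cfg(test)]
mod tests {
    use super::*;
    use pest::Parser;

    #[test]
    fn parses_a_record_into_fields() {
        // Parse a single row with the Record rule and collect the text of its Fields.
        let record = CSVParser::parse(Rule::Record, "2.3,6,9")
            .expect("record should parse")
            .next()
            .unwrap();
        let fields: Vec<&str> = record.into_inner().map(|f| f.as_str()).collect();
        assert_eq!(fields, vec!["2.3", "6", "9"]);
    }
}
```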
Second step: Render
Now that we have the required structures to hold the AST of the table, we are going to send them as objects to the JavaScript world. Those objects will be the source that feeds a UI built on top of libraries like React or Vue. In this case, we are going to follow the React path.
Those DTOs will be represented by Rust structs that hold the information parsed in the previous steps. To be able to cross from WASM to JavaScript, we need to derive the Serialize and Deserialize traits on our structures.
```rust
use serde::{Serialize, Deserialize};

#[derive(Serialize, Deserialize)]
pub struct Field {
    pub value: f32,
}

#[derive(Serialize, Deserialize)]
pub struct Record {
    pub fields: Vec<Field>,
}
```
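To make it concrete what crosses the WASM boundary, here is a quick illustration of the JSON shape that one Record serializes to. It uses the serde_json crate only for the example; the demo itself goes through wasm-bindgen's JsValue instead:

```rust
fn main() {
    // Build one row by hand and serialize it (illustration only).
    let record = Record {
        fields: vec![Field { value: 2.3 }, Field { value: 6.0 }, Field { value: 9.0 }],
    };
    // Prints something like: {"fields":[{"value":2.3},{"value":6.0},{"value":9.0}]}
    println!("{}", serde_json::to_string(&record).unwrap());
}
```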
Thanks to wasm-bindgen and wasm-pack, we can build a package that exposes a parse function that can be called from the JavaScript world:
```rust
#[wasm_bindgen]
pub fn parse(source: &str) -> JsValue {
    let file = CSVParser::parse(Rule::File, source)
        .expect("unsuccessful parse") // unwrap the parse result
        .next().unwrap(); // get and unwrap the `file` rule; never fails

    let mut records = vec![];
    for record in file.into_inner() {
        match record.as_rule() {
            Rule::Record => {
                let fields = record.into_inner().map(|field| {
                    Field {
                        value: field.as_str().parse::<f32>().unwrap()
                    }
                }).collect();
                records.push(Record {
                    fields
                })
            }
            Rule::EOI => (),
            _ => unreachable!(),
        }
    }
    JsValue::from_serde(&records).unwrap()
}
```
This function returns a JsValue with a representation of our Field and Record structures. Those objects will be used to render a nice, interactive UI built on top of React.
```jsx
import React from 'react';

const App = () => {
  // Retrieve it from markdown
  const source = "2.3,6,9\n12,3.44,5\n9,2,2\n"

  const [parse, setParse] = React.useState(() => {})
  const [records, setRecords] = React.useState([])

  React.useEffect(() => {
    import('./csvlib').then(({ parse }) => {
      setRecords(parse(source))
      setParse(() => parse)
    })
  }, [])

  return (
    <div className="App">
      <table className="table-auto border-collapse border border-green-800">
        <tbody>
          {records.map((record, index) => (
            <tr key={index}>
              {record.fields.map((field, index) => (
                <td key={index} className="border-2 border-green-600 px-6 py-2">
                  {field.value}
                </td>
              ))}
            </tr>
          ))}
        </tbody>
      </table>
    </div>
  );
}

export default App;
```
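One note on robustness: the parse function above panics on malformed input because of the expect and unwrap calls. Since wasm-bindgen lets an exported function return a Result whose Err becomes a JavaScript exception, a fallible variant could look like the following sketch (a hypothetical try_parse, not part of the demo):

```rust
#[wasm_bindgen]
pub fn try_parse(source: &str) -> Result<JsValue, JsValue> {
    // Turn a pest error into a JavaScript exception instead of panicking.
    let file = CSVParser::parse(Rule::File, source)
        .map_err(|e| JsValue::from_str(&e.to_string()))?
        .next()
        .ok_or_else(|| JsValue::from_str("empty parse result"))?;

    let mut records = vec![];
    for record in file.into_inner() {
        if record.as_rule() == Rule::Record {
            let fields = record
                .into_inner()
                // The grammar only allows digits, '.' and '-', so a float parse
                // failure is unlikely; default to 0.0 to keep the sketch short.
                .map(|field| Field { value: field.as_str().parse::<f32>().unwrap_or(0.0) })
                .collect();
            records.push(Record { fields });
        }
    }
    JsValue::from_serde(&records).map_err(|e| JsValue::from_str(&e.to_string()))
}
```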
Third step: Interaction
Now that we are able to render the parsed content, we can continue by adding a bit of interactivity to the view. You can imagine any nice use case for your widget: buttons, drag and drop, calendars, etc.
For demo purposes, we are going to add just a button to add new rows to the table.
This button will modify the internal state of the JavaScript objects that represent those records, and the UI will react accordingly. As an example, we are going to ask the user for a new row of float numbers to be added to the table:
```jsx
const onClick = () => {
  const value = prompt('Add a new row. Floats separated by commas:')
  const items = parse(`${value}\n`)
  setRecords([...records, ...items])
}

...

<button onClick={onClick}>
  <span>Add new row</span>
</button>
```
Fourth step: Serialization
Finally, the last step in the cycle is to serialize the React state back to Markdown plain text. To do this, we will again call a WASM function. With this approach, we encapsulate all the parsing and serializing logic in the same WASM module. We are going to implement the store function and expose it through wasm_bindgen:
```rust
#[wasm_bindgen]
pub fn store(js_objects: JsValue) -> JsValue {
    let records: Vec<Record> = js_objects.into_serde().unwrap();
    let value = records
        .iter()
        .map(|record| {
            record
                .fields
                .iter()
                .map(|field| {
                    field.value.to_string()
                })
                .collect::<Vec<_>>()
                .join(",")
        })
        .collect::<Vec<_>>()
        .join("\n");
    JsValue::from_serde(&value).unwrap()
}
```
Then, we call this new function from React, passing as an argument all the rows of the table held in the state:
```jsx
import React from 'react';

const App = () => {
  ...

  const [store, setStore] = React.useState(() => {})

  React.useEffect(() => {
    import('./csvlib').then(({ parse, store }) => {
      ...
      setStore(() => store)
    })
  }, [])

  const onRetrieve = () => {
    const source = store(records)
    const markdown = `\`\`\`\n${source}\n\`\`\``
    console.log('Markdown:')
    console.log(markdown)
  }

  return (
    ...
  );
}

export default App;
```
And we reach the final goal: new Markdown 🎉 for this dynamic table component:
```table
2.3,6,9
12,3.44,5
9,2,2
1,2,3
```
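As a closing design note, wasm-bindgen can also return a Rust String directly to JavaScript, so the serialization step does not strictly need the JsValue wrapping. A possible simplification of store, sketched here rather than what the demo uses, would be:

```rust
#[wasm_bindgen]
pub fn store_to_string(js_objects: JsValue) -> String {
    // Deserialize the records coming from React and join them back into CSV lines.
    let records: Vec<Record> = js_objects.into_serde().unwrap();
    records
        .iter()
        .map(|record| {
            record
                .fields
                .iter()
                .map(|field| field.value.to_string())
                .collect::<Vec<_>>()
                .join(",")
        })
        .collect::<Vec<_>>()
        .join("\n")
}
```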
Conclusion
We went through all the steps required to parse arbitrary data, feed a reactive UI, and close the cycle back to plain text. As we said, the possibilities of this approach for Markdown documentation or richer second brains are endless.
To see it working, check the embedded demo widget (Markdown UI - Table): the textarea is interactive, so you can change the Markdown value or add rows to the React table and they stay in sync.
You can find the demo source code here.
In a future post we are going to implement an interactive kanban board with tasks, following the same strategy as this one. Stay tuned!
Thanks for reading, and thanks to all the people who helped me polish this article!
Happy hacking!