munsellinterpol is an R package that transforms from Munsell HVC to CIE xyY, and back again. Our package was inspired by Paul Centore’s An Open-Source Inversion Algorithm for the Munsell Renotation [2].
The goals of this package are:

1. in the forward conversion HVC \(\to\) xyY, an input HVC from the published renotation tables maps to the xyY in those tables
2. the forward and reverse conversions are inverses of each other, so that round-trips return the starting point
3. both conversions are valid throughout the entire Munsell color solid, including the very dark colors
4. the forward conversion HVC \(\to\) xyY is of class \(C^1\) (continuously differentiable)

Goals 1, 2, and 3 have been met: extensive tests, with many color samples, were passed. There were initial problems with very dark samples where V \(<\) 1, so it is possible that some other dark samples were missed, and that goal 3 fails for them.
In attempting to meet goal 4, the forward map HVC \(\to\) xyY uses Catmull-Rom spline interpolation in V (by default). But because of a change in Value spacing, the function HVC \(\to\) xyY is not \(C^1\) on the plane V=1: the Value spacing is 1 when V \(\ge\) 1, but 0.2 when V \(\le\) 1. So goal 4 has not been met. For H and C, bicubic interpolation is used, again with Catmull-Rom splines; see [27] and [26]. There are options to use simpler linear interpolation for V and bilinear interpolation for H and C; these are useful for comparison with other algorithms, such as [16]. Of course, to meet goal 2 the options for the forward and inverse maps must be the same.
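For example, to compare the default spline interpolation with the simpler variants (a sketch; we assume here the argument names hcinterp and vinterp from the man page of MunsellToxyY(), so check your installed version):

# default: bicubic in H and C, cubic in V (Catmull-Rom splines)
MunsellToxyY( '4.2RP 5.5/8' )
# simpler variants, useful for comparison with older algorithms such as [16]
MunsellToxyY( '4.2RP 5.5/8', hcinterp='bilinear', vinterp='linear' )
# for round-trips (goal 2), pass the same options to xyYtoMunsell()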
Note that goal 1 does not include points from all.dat. This goal is actually met for V \(\ge\) 1, but for V \(<\) 1 we had to ignore the points from all.dat and re-extrapolate to make inversion work reliably. In general, in this package the xy for these non-real points is further from white than the corresponding points in all.dat. For the meaning of all.dat see Appendix A.
The package dependencies are rootSolve (used for the root-solver in the inversion xyY \(\to\) HVC), spacesXYZ, and spacesRGB. The package microbenchmark [15] is suggested, for its high-precision timer.
Similar packages: munsell [21] also does HVC \(\leftrightarrow\) sRGB conversion, but only for discrete HVCs from the Munsell Book; there is no interpolation.
We assume that the reader is familiar with Munsell Hue, Value, and Chroma, which we abbreviate by HVC. A good introduction is [23]. All 3 are stored as floating-point numbers: H is in the interval (0,100], V is in [0,10], and C is non-negative with upper limit a complex function of H and V. These are cylindrical coordinates on the irregular Munsell object color solid. We assume that the reader is familiar with the Munsell character string notation H V/C for chromatic colors, and N V/ for achromatics (neutrals).
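As a small illustration of the notation, here is a sketch of converting between strings and numeric HVC; MunsellNameFromHVC() is used later in this vignette, and we assume here that its counterpart HVCfromMunsellName() behaves as shown:

HVCfromMunsellName( '4.2RP 5.5/8' )    # c(94.2, 5.5, 8); each Hue family spans 10 units of H
MunsellNameFromHVC( c(94.2, 5.5, 8) )  # '4.2RP 5.5/8'
MunsellNameFromHVC( c(0, 5.5, 0) )     # a neutral: 'N 5.5/'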
We assume that the reader is familiar with sRGB; a good reference is [22].
We assume that the reader is familiar with the CIE spaces xyY, XYZ, and Lab. Y is the luminance factor and in this domain is considered to be a percentage in the interval [0,100]. A perfectly black surface has Y=0, and a perfectly reflecting diffuser has Y=100. An excellent reference is [10].
There are, and have been, many other software programs that do these conversions. The earliest one we know of is [17] (1960), which ran on an IBM 704. See [2] for a discussion of many other software programs and algorithms.
The forward conversion HVC \(\to\) xyY is as simple as:
MunsellToxyY( '4.2RP 5.5/8' )
## SAMPLE_NAME HVC.H HVC.V HVC.C xyY.x xyY.y xyY.Y
## 1 4.2RP 5.5/8 94.2 5.5 8.0 0.3624688 0.2733678 23.9680791
The return value is a data.frame. The first column SAMPLE_NAME is the input Munsell notation. The second column HVC is its numerical version. And the 3rd column is the output xyY. If the input is a character vector of length N, then the data.frame has N rows. The input can also be a numeric matrix HVC with N rows. There are many options for advanced users; see the man page for details.
And here is an example of the reverse conversion xyY \(\to\) HVC:
xyY = MunsellToxyY( '4.2RP 5.5/8' )$xyY
xyYtoMunsell( xyY )
## xyY.x xyY.y xyY.Y HVC.H HVC.V HVC.C SAMPLE_NAME
## 1 0.3624688 0.2733678 23.9680791 94.2 5.5 8.0 4.2RP 5.5/8
And so the round-trip has returned us to 4.2RP 5.5/8. In general, the round-trip is accurate to about 2 decimal places; the worst case is the Hue at near neutrals. Note that the return value is also a data.frame, but now SAMPLE_NAME comes at the end.
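Here is a minimal sketch of checking the round-trip error on a few chromatic samples, using only the functions shown above:

tmp = MunsellToxyY( c('4.2RP 5.5/8','10GY 6/10') )
HVC = xyYtoMunsell( tmp$xyY )$HVC
round( HVC - tmp$HVC, 4 )   # expect agreement to about 2 decimal places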
Other color spaces available are: sRGB, XYZ, Lab, Luv, and Colorlab. All the conversion functions accept multiple “samples”. Some of the functions return a data.frame and some return a plain matrix. See the Reference Guide for details.
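For example, here is an sRGB round-trip sketch using MunsellToRGB() and RGBtoMunsell(), both of which appear later in this vignette; the RGB values are those of Pantone's Turquoise from the section below:

HVC = RGBtoMunsell( c(69,181,170), space='sRGB' )
MunsellNameFromHVC( HVC )                          # '4.9BG 6.6/6.8'
MunsellToRGB( '4.9BG 6.6/6.8', space='sRGB' )$RGB  # approximately (69,181,170) again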
The most obvious thing to plot is a simulation of a page from the Munsell Book of Color:
par( omi=c(0,0,0,0), mai=c(0.6,0.7,0.4,0.2) )
plotPatchesH( "10GY", back='#f7f7f7' )
This is the forward conversion HVC \(\to\) xyY \(\to\) XYZ \(\to\) sRGB. There is no interpolation here, unless the Hue is not a multiple of 2.5, which is allowed. For each Value, the Chroma is extended up to the MacAdam limit, and this determines the limit of the Chroma axis. If the sRGB for a patch is outside the cube (before clamping), it is not drawn and the sRGB values are printed instead. This makes it easy to see why the patch is outside the sRGB gamut. The chromatic adaptation method, the background color, and the main title can be controlled.
Note that the plot is titled with sRGB, and this plot is best viewed on a display calibrated for sRGB. Adobe RGB space is also available.
par( omi=c(0,0,0,0), mai=c(0.6,0.7,0.4,0.2) )
plotPatchesH( "10GY", space='AdobeRGB', back='#f7f7f7' )
Of course, this one is best viewed on a display calibrated for Adobe RGB. Note that the gamut is quite a bit larger in this figure. Users can add custom RGB spaces too.
In the Munsell arena, another standard plot is the curves of constant Hue and Chroma.
par( omi=c(0,0,0,0), mai=c(0.6,0.7,0.6,0.2) )
plotLociHC( value=8 )
The filled circles are points from real.dat (see Appendix A), and the open circles are extrapolated from the “real” ones. The blue curve is the CIE inverted-U, and the red curve is the MacAdam limit for the given Value=8.
One can also plot in the a,b plane.
par( omi=c(0,0,0,0), mai=c(0.6,0.7,0.6,0.2) )
plotLociHC( value=8, coords='ab' )
The interpolation takes place in x,y and is then mapped to a,b. For similar plots, see the Loci of Constant Hue and Chroma vignette.
The ISCC-NBS System is a partition of the Munsell Color Solid into 267 color blocks; see [9]. Each block has an ISCC-NBS Name; one of the goals of these names is that they should be “simple enough to be understood by the average person on the street”. Each block is a disjoint union of elementary blocks (of which there are 932), where an elementary block is defined by its minimum and maximum limits in Hue, Value, and Chroma. The peripheral blocks (of which there are 120) have arbitrarily large chroma (maximum Chroma is \(\infty\)). Each block has an associated centroid color, see [8], which is in the interior of the block (even though some blocks are non-convex). Given a query point HVC, the function ColorBlockFromMunsell() searches for the one elementary block that contains that point.
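A minimal query looks like this (a sketch; the Name and Centroid columns are the ones used later in this vignette):

block = ColorBlockFromMunsell( c(94.2, 5.5, 8) )   # the numeric HVC of '4.2RP 5.5/8'
block$Name       # the ISCC-NBS Name of the containing block
block$Centroid   # the Munsell notation of the block's centroid color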
Since 2000, the Pantone Color Institute has declared a “Color of the Year”; see [25]. And they publish the sRGB coordinates of that color. We thought it would be interesting to compare the Pantone colors with the corresponding ISCC-NBS Names and centroids. Read the data for all colors since 2010.
path = system.file( 'extdata/PantoneCoY.txt', package='munsellinterpol' )
pantone = read.table( path, header=TRUE, sep='\t', strings=FALSE )
pantone
## Year Name Code R G B
## 1 2010 Turquoise 15-5519 69 181 170
## 2 2011 Honeysuckle 18-2120 217 79 112
## 3 2012 Tangerine Tango 17-1463 221 65 36
## 4 2013 Emerald 17-5641 0 148 115
## 5 2014 Radiant Orchid 18-3224 177 99 163
## 6 2015 Marsala 18-1438 149 82 81
## 7 2016 Rose Quartz 13-1520 247 202 201
## 8 2016 Serenity 15-3913 146 168 209
## 9 2017 Greenery 15-0343 136 176 75
## 10 2018 Ultra Violet 18-3838 95 75 139
## 11 2019 Living Coral 16-1546 255 111 97
## 12 2020 Classic Blue 19-4052 15 76 129
## 13 2021 Illuminating 13-0647 245 223 77
## 14 2022 Very Peri 17-3938 102 103 171
Add the block names and centroids.
pantone$Year = NULL ; pantone$Code = NULL
RGB = as.matrix( pantone[ , c('R','G','B') ] )
HVC = RGBtoMunsell( RGB, space='sRGB' )
pantone$Munsell = MunsellNameFromHVC( HVC )
block = ColorBlockFromMunsell( HVC )
pantone[[ "ISCC-NBS Name" ]] = block$Name
pantone[[ "ISCC-NBS Centroid" ]] = block$Centroid
pantone
## Name R G B Munsell ISCC-NBS Name ISCC-NBS Centroid
## 1 Turquoise 69 181 170 4.9BG 6.6/6.8 light bluish green 4.5BG 6.5/5.0
## 2 Honeysuckle 217 79 112 0.16R 5.3/13 strong purplish red 7.0RP 4.5/11.9
## 3 Tangerine Tango 221 65 36 8.5R 5/15 vivid reddish orange 9.5R 5.5/15.5
## 4 Emerald 0 148 115 7.7G 5.3/7.7 strong green 6.0G 4.5/9.1
## 5 Radiant Orchid 177 99 163 9.7P 5.2/11 strong reddish purple 1.0RP 4.5/11.1
## 6 Marsala 149 82 81 4.8R 4.2/6.1 grayish red 4.6R 4.5/4.7
## 7 Rose Quartz 247 202 201 2.9R 8.5/4 light pink 2.5R 8.6/5.2
## 8 Serenity 146 168 209 5.9PB 6.8/6.1 light purplish blue 7.5PB 6.0/6.8
## 9 Greenery 136 176 75 6.4GY 6.6/8.1 strong yellow green 5.0GY 6.0/9.1
## 10 Ultra Violet 95 75 139 0.87P 3.6/8.6 moderate violet 1.0P 3.5/7.2
## 11 Living Coral 255 111 97 6.6R 6.4/13 deep yellowish pink 5.6R 6.0/12.4
## 12 Classic Blue 15 76 129 4.5PB 3.1/8 moderate blue 2.9PB 4.2/7.2
## 13 Illuminating 245 223 77 6.6Y 8.8/9.8 brilliant yellow 4.0Y 8.8/9.5
## 14 Very Peri 102 103 171 8.6PB 4.6/9.4 brilliant purplish blue 8.0PB 6.0/10.8
Note that Emerald is unfortunately outside the sRGB gamut. Now compare the original color with the centroid approximation to it. The color match is poor.
color.pant = rgb( RGB, max=255 )
color.cent = rgb( MunsellToRGB( block$Centroid, space='sRGB' )$RGB, max=255 )
tbl = data.frame( row.names=1:nrow(pantone) )
tbl[[ "Pantone Name" ]] = pantone$Name
tbl[[ "Pantone Color" ]] = '' ; tbl[[ "Centroid Color" ]] = ''
tbl[[ "ISCC-NBS Name" ]] = block$Name
library( flextable )
myrt <- regulartable( tbl )
myrt <- align( myrt, j=4, align='left', part='all' )
myrt <- align( myrt, j=2:3, align='center', part='all' )
myrt <- hrule( myrt, rule='exact' )
myrt <- height( myrt, height=0.75 )
myrt <- width( myrt, j=c(1,4), width=2 ) ; myrt <- width( myrt, j=2:3, width=2.5 )
myrt <- fontsize( myrt, size=14, part='all' ) ; myrt <- fontsize( myrt, size=16, part='header' )
for( i in 1:nrow(tbl) )
{ myrt <- bg(myrt, i=i, j=2, bg=color.pant[i]) ; myrt <- bg(myrt, i=i, j=3, bg=color.cent[i]) }
myrt
[In the rendered vignette, the two middle columns are solid color swatches.]

Pantone Name | Pantone Color | Centroid Color | ISCC-NBS Name
Turquoise | [swatch] | [swatch] | light bluish green
Honeysuckle | [swatch] | [swatch] | strong purplish red
Tangerine Tango | [swatch] | [swatch] | vivid reddish orange
Emerald | [swatch] | [swatch] | strong green
Radiant Orchid | [swatch] | [swatch] | strong reddish purple
Marsala | [swatch] | [swatch] | grayish red
Rose Quartz | [swatch] | [swatch] | light pink
Serenity | [swatch] | [swatch] | light purplish blue
Greenery | [swatch] | [swatch] | strong yellow green
Ultra Violet | [swatch] | [swatch] | moderate violet
Living Coral | [swatch] | [swatch] | deep yellowish pink
Classic Blue | [swatch] | [swatch] | moderate blue
Illuminating | [swatch] | [swatch] | brilliant yellow
Very Peri | [swatch] | [swatch] | brilliant purplish blue
This table is best viewed on a display calibrated for sRGB. In the 6 levels of color designation (from coarsest to finest) in [9], ISCC-NBS is Level 3 and Pantone would most likely be Level 4.
Here are a few possible improvements and additions.
The conversions are already pretty fast. On my PC with 3GHz Intel Core Duo, the forward conversion HVC \(\to\) xyY takes about 1 msec/sample, and the reverse xyY \(\to\) HVC takes about 8 msec/sample. This assumes a large number of samples per function call. The next obvious speed-up is to move the interpolation code for the forward conversion from R to compiled C. It would have the added benefit that the 2x2 Jacobian could also be computed during the inversion xyY \(\to\) HVC, which would reduce the number of function evaluations per Newton-Raphson iteration from 3 to 1.
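To see where the count of 3 comes from: with only the forward map available, each Newton-Raphson step must estimate the 2x2 Jacobian by forward differences, roughly like this sketch (not the package's actual code):

# one evaluation at the point, plus one per coordinate for the forward differences
numericJacobian <- function( f, AB, h=1.e-6 )
    {
    f0 = f( AB )                               # evaluation 1
    J  = cbind( (f( AB + c(h,0) ) - f0)/h,     # evaluation 2
                (f( AB + c(0,h) ) - f0)/h )    # evaluation 3
    list( value=f0, jacobian=J )
    }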
For V \(<\) 1, it might be possible to restore the extrapolated points from all.dat. But it will probably take better 3D visualization to assist in viewing the xy irregularities in these very dark colors.
For history prior to 1943, please consult [12], [13], and [14].
The OSA Subcommitte on the Spacing of the Munsell Colors issued their Final Report in 1943 [11]. There were 2734 published coordinates (what might be called aim points), for Munsell Values from 1 to 9. The work was based on visual interpolation and smoothing on xy graphs of spectrophotometrically measured physical samples from the Munsell Book of Color (1929). There were 421 samples from [7] and 561 samples from [5]. There were a few obvious transcription errors in this data, which will be discussed below. After 1943, the Munsell Book of Color used these aim points, which are called the renotations.
In 1956 Judd and Wyszecki extended the renotation to Values 0.2/, 0.4/, 0.6/, and 0.8/, in [6]. The Chromas are /1, /2, /3, /4, /6, and /8. However, Chroma /1 was obtained by averaging /2 with white, and /3 was obtained by averaging /2 with /4. If Chromas /1 and /3 are dropped there are 355 very dark samples that remain. There were now 2734 + 355 = 3089 renotations, with even Chroma, in 1956.
In 1958, at the Forty-Third Annual Meeting of the Optical Society of America, a 15-minute talk [19] was given on extrapolation of these renotations for a computer program in development. Since the abstract of [19] is so short, we think it is appropriate to quote it in full:
TA15. Extension of the Munsell Renotation System. J. C. SCHLETER, DEANE B. JUDD, AND H. J. KEEGAN, National Bureau of Standards, Washington, D. C. - In order to convert tristimulus data to Munsell renotations by using the tri-dimensional, linear, interpolation program developed at the NBS for the high-speed, digital IBM 704 computer, it was necessary to extend the published data of the Munsell renotation system [1,2] by extrapolation so that, for each point in Munsell color space, the enclosing six-sided volume would be defined by its eight corners. The approximately 2000 corners to be defined by CIE chromaticity coordinates, x, y, are at a given value level the intersections of the hue and chroma loci that cross the MacAdam limit [3] at that value level and at the next lower level. Tentative definitions were found by graphical extrapolation on plots at each of the 14 value levels similar to those published. A second set of tentative definitions were found by plotting x and y against Munsell value for each of the 40 Munsell hues and extrapolating to value 10/. The final definitions were found by adjustment of the two tentative sets to produce smooth loci in both types of plots. Plots at selected value levels will be shown illustrating the extended data. (15 min.)
1 Newhall, Nickerson, and Judd, J. Opt. Soc. Am. 33, 385 (1943).
2 D. B. Judd and G. Wyszecki, J. Opt. Soc. Am. 46, 281 (1956).
3 D. MacAdam, J. Opt. Soc. Am. 25, 361 (1935).
The “2000 corners” is the number of points in the extension; we will call these the Schleter points.
The next talk [16] was a preliminary report on the above-mentioned software program. The authors note that 1) the extended grid from [19] “consists of approximately 5000 points”, 2) the program stores these points on magnetic tape, and 3) an xyY \(\to\) HVC conversion takes “approximately 20 seconds per sample”.
In 1960 there was an updated report [17] on the program. Rheinboldt and Menard report that the 1956 grid of 3089 points was extended by 1907 points to yield 4996 points. So the exact number of Schleter points is 1907. We also learn that the memory of the NBS IBM 704 “had been increased to 8000 words” for “the second version of the code”. This made it possible to store the entire grid in memory. They do not report the improved conversion time per sample.
Skip now to contemporary times. The Program of Color Science at the Rochester Institute of Technology has posted 2 files: real.dat and all.dat; see [18]. The first file, real.dat, is described as “those colors listed in the original 1943 renotation article” [11]. There are 2734 samples in the file, so this is an exact match! Paul Centore found that before June 2010 there were 5 samples missing, but this has now been corrected (in 2018). Note that real.dat does not include the very dark colors, even though they are inside the MacAdam limits.
all.dat is described this way:

These are all the Munsell data, including the extrapolated colors. Note that extrapolated colors are in some cases unreal. That is, some lie outside the Macadam [sic] limits.

This file should be used for those performing multidimensional interpolation to/from Munsell data. You will need the unreal colors in order to completely encompass the real colors, which is required to do the interpolation when near the Macadam limits.
There are 4995 samples in all.dat, which differs from the count in [17] by only 1. Perhaps the latter count includes Illuminant C? These two counts are so close that we are confident that the extended grid in the NBS software from 1958 survived, and made its way to the Program of Color Science at RIT, and then into this package. The above-quoted abstract is likely all we will ever know about the methods used to extrapolate the values in all.dat. Note that all.dat does include the very dark colors.
Now return to the NBS computer program in [17]. For the forward conversion HVC \(\to\) xy, they first determine the two adjacent planes of constant V that the given point is between. In both of these HC planes they use bilinear interpolation to determine xy. They then linearly interpolate these 2 xy vectors. So a single conversion requires a table lookup of 8 points in general. With this method, the plotted ovoids of constant chroma are 40-sided polygons, and not the smooth ovoids in the published plots by Dorothy Nickerson. For an example, which looks more like a spiderweb, see the Loci of Constant Hue and Chroma vignette.
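A sketch of that method, assuming a hypothetical grid-lookup function xyLookup(H,V,C) and the grid spacings for V \(\ge\) 1 (Hue spacing 2.5, Chroma spacing 2, Value spacing 1):

bilinearHC <- function( H, C, V, xyLookup )
    {
    H0 = 2.5*floor(H/2.5) ; C0 = 2*floor(C/2)        # lower grid corners
    s = (H - H0)/2.5 ; t = (C - C0)/2                # fractional positions
    (1-s)*(1-t)*xyLookup(H0,V,C0)   + s*(1-t)*xyLookup(H0+2.5,V,C0) +
    (1-s)*t    *xyLookup(H0,V,C0+2) + s*t    *xyLookup(H0+2.5,V,C0+2)
    }

xyNBS <- function( H, V, C, xyLookup )
    {
    V0 = floor(V) ; w = V - V0                       # bracketing Value planes
    (1-w)*bilinearHC(H,C,V0,xyLookup) + w*bilinearHC(H,C,V0+1,xyLookup)
    }   # 8 table lookups in general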
To get smooth ovoids, the default method in munsellinterpol uses bicubic interpolation in the HC plane. The forward conversion is then class \(C^1\) in H and C (except at C=0). To get \(C^1\) continuity in V as well, the default conversion uses cubic interpolation for 4 planes of constant Value: 2 above the input Value and 2 below. This requires a table lookup of 64 points in general, and it requires extending the renotations even more! In the terminology of mathematical morphology, the required binary image is the dilation of the 4995 points with a 3x3x3 structuring element. The grid in munsellinterpol currently has 7606 points.
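For reference, the weights of a uniform Catmull-Rom cubic, at fractional position s between the two middle nodes, are given below. Note the negative weights on the two outer nodes; this is the overshoot and undershoot discussed in the next paragraph. (The Value spacing near V=1 is non-uniform, so the actual weights there differ, but the sign pattern is the same.)

catmullRomWeights <- function( s )   # s in [0,1]
    {
    c( -0.5*s^3 +     s^2 - 0.5*s,   # weight of the node below the interval
        1.5*s^3 - 2.5*s^2 + 1,
       -1.5*s^3 + 2.0*s^2 + 0.5*s,
        0.5*s^3 - 0.5*s^2 )          # weight of the node above the interval
    }
catmullRomWeights( 0.5 )   # -0.0625 0.5625 0.5625 -0.0625 ; the weights sum to 1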
The extrapolation was done, again from the grid of 4995 points, using the function gam() in the package mgcv, which is preinstalled with R itself. It is a thin-plate spline model. A separate model was computed for each of the Munsell Values: 0.2, 0.4, 0.6, 0.8, 1, ..., 10. Unfortunately, for V=1.5 (or almost any V near 1.5) the forward conversion HVC \(\to\) xy was badly non-injective for small Chroma in the area of the hue GY. This is because the extrapolated xy values at V=0.8 diverge a lot from those at V=1 and V=2 (they went too far into the half-plane x+y \(>\) 1). Recall that a conversion at V=1.5 requires lookup at the 4 planes V=0.8, 1, 2, and 3. Also note that the weight at V=0.8 is negative. Cubic interpolation has overshoot and undershoot; linear interpolation (where the weights are non-negative) does not.
So for dark colors (V \(<\) 1) we decided to ignore the extrapolations in [19], and re-extrapolate, using an mgcv::gam() model, from all real points with V \(\le\) 1. This extrapolation was satisfactory.
There have been at least 4 versions of the renotation tables. In chronological order they are: the original 1943 article [11], the tables in [28], the tables in [3], and all.dat at [18], mentioned in the previous Appendix.

I have noticed 4 discrepancies, and there might be more. They appear to be either simple transcription errors, or adjustments to make the data smoother. This package uses all.dat, except for 2.5PB 10/2, noted below.
For samples 7.5BG 4/16 and 5BG 1/4 it looks like a simple transcription error in [28], which was corrected in [3] and [18]. For 10PB 8/4 it looks like [3] made a smoothing adjustment, which was carried over to [18]. For 2.5R 9/2 it looks like [18] made a smoothing adjustment.
Finally, in the extrapolated data in [18], we have made a small smoothing adjustment at 2.5PB 10/2 for this package. Before the adjustment, the forward mapping HVC \(\to\) xyY was not injective near N 10/, which caused the root-finding iteration to cycle. This was discovered during the testing phase; more ambitious “vetting” software, which specifically searches for non-injectivity, might uncover more of these.
Given an xy target point, the method in [17] searches for the exact quadrilateral that contains xy. It then solves a pair of quadratic equations to determine H and C. For bicubic interpolation, a similar explicit solution is not really feasible. The method in [2] starts with an initial approximation HC\(_0\), and modifies H and C alternately to converge to an HC that maps to xy, within a certain tolerance.
We decided to use a general root-solver from the package rootSolve that uses Newton-Raphson iteration. In our case, this iteration really does double duty: the first iteration or two locate the quadrilateral that contains xy, and the remaining iterations polish the root. It typically takes about 4 or 5 total iterations. The theory of Newton's Method only covers functions of class \(C^2\) (see [24]), and our function is at best \(C^1\). However, it is \(C^\infty\) on the quadrilateral interiors, so if the root is in the interior of a quadrilateral, and the iteration gets close enough to the root, then it typically stays inside that quadrilateral and converges. Even when a root is on the boundary of a quadrilateral, we have found that the basin of attraction is still large enough to find it. The usual trouble is when the initial estimate is too far from the root, and the iteration then wanders outside the lookup table itself. Cycling is another possible problem: the iteration could jump around from quadrilateral to quadrilateral. But we have only seen this happen once, when the function was non-injective near white (mentioned at the end of the previous section). The quadrilaterals become small triangular slivers there, which causes more trouble near all neutrals.
Instead of HC itself, we decided to use a rectangular system: \[A = C \cos \left( \frac{\pi}{50} H \right) ~~~~~~~~ B = C \sin \left( \frac{\pi}{50} H \right)\] with the inverse: \[H=\left( \frac{50}{\pi} \right) \text{atan2}(B,A) ~~~~~~~~ C = \text{sqrt}( A^2 + B^2 )\]
\(A\) and \(B\) are analogous to \(a\) and \(b\) in \(Lab\) space. The symbols \(a''\) and \(b''\) are used for the same coordinate system in [4] page 207. So now we are trying to invert \(AB \to HC \to xy\). Each iteration requires an extra conversion \(AB \to HC\), but this is very fast.
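In code, the change of coordinates and its inverse are straightforward:

ABfromHC <- function( H, C )
    { C * c( cos(pi*H/50), sin(pi*H/50) ) }

HCfromAB <- function( A, B )
    {
    H = (50/pi) * atan2( B, A )
    c( H %% 100, sqrt(A^2 + B^2) )   # wrap H into the Hue interval
    }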
For the initial estimate, the algorithm in [2] recommends: \[A_0 = a/5.5 ~~~~~~~~~ B_0 = b/5.5\] We use a cubic polynomial in \(a\) and \(b\) for better accuracy. A separate polynomial is computed for each V-plane, using R’s built-in function for linear models with no intercept:
stats::lm( A ~ polym(a,b,degree=3,raw=TRUE) + 0, ... )
and similarly for \(B\). We precompute the model, but only save the coefficients. For V \(<\) 1, a cubic polynomial in \(a\) and \(b\) was still not good enough, and it was found that a cubic in \(\Delta x\) and \(\Delta y\) worked better, where \(\Delta x = x - x_C\), \(\Delta y = y - y_C\), and \(x_C,y_C\) are the chromaticity of Illuminant C. Evaluation of the polynomial takes much longer than division by 5.5, but it is still negligible compared to the iterations to come.
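Here is a sketch of how such a model might be precomputed and later evaluated from its saved coefficients alone; the data frame df, with columns A, a, and b for one V-plane, is hypothetical:

fit   = stats::lm( A ~ polym(a,b,degree=3,raw=TRUE) + 0, data=df )
coefA = coef( fit )    # save only these 9 coefficients, and discard the model
# later: evaluate the cubic at a query point (a,b), e.g. a=10, b=-5
A0 = sum( coefA * as.vector( polym( 10, -5, degree=3, raw=TRUE ) ) )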
The root-solver is called like this:
rootSolve::multiroot( forwardfun, AB0, rtol=1.e-8, atol=1.e-6, ctol=0, xy_target=xy_target, ...)
The function forwardfun() performs the equivalent of MunsellToxyY(), but does not call it explicitly. The argument AB0 is the initial estimate. It usually takes 4 or 5 iterations to converge with the indicated tolerances, which are in the xy plane.
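As a self-contained illustration of the same calling pattern, here is a toy forward map in place of the package's real one (the coefficients are made up):

library( rootSolve )
forwardfun <- function( AB, xy_target )
    {
    # a toy AB -> xy map; multiroot() drives the returned residual to 0
    xy = c( 0.31 + 0.01*AB[1] + 0.002*AB[1]*AB[2],
            0.32 + 0.01*AB[2] - 0.001*AB[1]^2 )
    xy - xy_target
    }
xy_target = c( 0.3624688, 0.2733678 )
sol = multiroot( forwardfun, start=c(0,0), rtol=1.e-8, atol=1.e-6, ctol=0,
                 xy_target=xy_target )
sol$root ; sol$iter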
R version 4.1.3 (2022-03-10)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 19044)

Matrix products: default

locale:
[1] LC_COLLATE=C                           LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252 LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] stats graphics grDevices utils datasets methods base

other attached packages:
[1] flextable_0.6.10 spacesXYZ_1.2-1 spacesRGB_1.5-0 munsellinterpol_3.0-0

loaded via a namespace (and not attached):
[1] Rcpp_1.0.7           knitr_1.37      xml2_1.3.3       magrittr_2.0.1
[5] uuid_1.0-3           R6_2.5.1        rlang_0.4.12     fastmap_1.1.0
[9] stringr_1.4.0        highr_0.9       tools_4.1.3      data.table_1.14.2
[13] rootSolve_1.8.2.3   xfun_0.28       htmltools_0.5.2  systemfonts_1.0.3
[17] yaml_2.2.1          digest_0.6.28   zip_2.2.0        officer_0.4.1
[21] microbenchmark_1.4.9 base64enc_0.1-3 evaluate_0.14   rmarkdown_2.11
[25] stringi_1.7.5       compiler_4.1.3  gdtools_0.2.3