Go "players" may be very confused when they see this topic-for JSON, the Go native library encoding/json
already provides a comfortable enough JSON processing tool and is widely praised by Go developers. What can still be wrong? However, in the process of business development, we encountered a lot of problems that the original json
could not be done well or even impossible, and it really couldn't fully meet our requirements.
So, is there any problem with it? When to use third-party libraries? How to choose? How is the performance?
However, before throwing specific questions, let's first understand as briefly as possible some of the libraries that Go currently uses in processing JSON, as well as the test data analysis of these libraries. If the reader feels that the text below is too long, you can skip directly to the conclusion part.
Some commonly used Go JSON parsing libraries
Go native encoding/json
This should be the library most familiar to Go programmers. With the json.Unmarshal and json.Marshal functions, you can easily deserialize JSON-formatted byte data into a specified Go struct, and serialize a Go struct into a byte stream. For data whose structure is unknown or uncertain, it also supports deserializing the bytes into a map[string]interface{} and accessing the values KV-style.
Here are two extra features you may not have noticed:
- The json package parses a complete JSON value, which can be an object, an array, a string, a number, a boolean, or null. The two functions above actually support parsing any of these types directly. For example, the following code works:
```go
var s string
err := json.Unmarshal([]byte(`"Hello, world!"`), &s)
// Note that the double quotes inside the string are required: a bare
// `Hello, world` is not a valid JSON value and returns an error.
```
- When unmarshaling, the json package matches keys case-insensitively where it can. Even if a key's case differs from the definition in the struct, the field is still assigned as long as the key matches after ignoring case. The following example illustrates this:
```go
cert := struct {
	Username string `json:"username"`
	Password string `json:"password"`
}{}
err := json.Unmarshal([]byte(`{"UserName":"root","passWord":"123456"}`), &cert)
if err != nil {
	fmt.Println("err =", err)
} else {
	fmt.Println("username =", cert.Username)
	fmt.Println("password =", cert.Password)
}
// Actual output:
// username = root
// password = 123456
```
jsoniter
Open jsoniter's GitHub homepage and it leads with two keywords: high-performance and compatible. These are also the package's two biggest selling points.
Compatibility first: jsoniter's biggest advantage is that it is 100% compatible with the standard library, so code can be migrated with very little effort. And if even that is inconvenient, you can use Go Monkey to forcibly patch the json-related function entry points.
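As a minimal sketch of what that drop-in migration can look like (the package-level variable name json is my own illustration, not from the original article):

```go
package main

import (
	"fmt"

	jsoniter "github.com/json-iterator/go"
)

// Pointing a package-level variable at jsoniter's standard-library-compatible
// config lets the rest of the code keep calling json.Marshal / json.Unmarshal
// unchanged.
var json = jsoniter.ConfigCompatibleWithStandardLibrary

func main() {
	b, err := json.Marshal(map[string]int{"answer": 42})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b)) // {"answer":42}
}
```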
Now for performance: as with every open source library that boasts about its speed, its own benchmark conclusions cannot be accepted without thinking. Here are a few simple conclusions from my own measurements:
- In the single scenario of deserializing into a struct, jsoniter is indeed faster than the standard library; my own measurement put it at roughly 1.4x.
- But in that same struct-deserialization scenario, jsoniter is far behind easyjson.
- In other scenarios jsoniter is not necessarily faster at all, as I'll explain later.
As for why jsoniter is faster than the official library, which was developed jointly by many highly capable engineers: first, it minimizes unnecessary memory copies; second, it reduces the use of reflect — for objects of the same type, jsoniter runs the reflect analysis only once and caches the result. However, as Go versions iterate, the native json library keeps getting faster, and jsoniter's performance advantage keeps narrowing.
In addition, jsoniter provides a Get function that operates directly on []byte data, which I'll come back to later.
easyjson
This is another JSON package on GitHub. Next to jsoniter's 9k stars, easyjson's 3k may look modest, but it is actually a very popular open source project.
This package's main selling point is, again, speed. Why is easyjson even faster than jsoniter? Because easyjson's development model resembles protobuf's: before the program runs, you use its code-generation tool to produce serialization/deserialization code for each struct, so every type gets its own tailor-made parsing functions.
But that same development model makes easyjson more intrusive to the business: on the one hand, the code-generation step has to run before go build; on the other hand, the generated JSON handling functions are not compatible with the native json library.
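A rough sketch of that workflow, using a hypothetical user.go file (the names here are mine, not from the original article):

```go
// user.go
package model

//easyjson:json
type User struct {
	Name string `json:"name"`
	Age  int    `json:"age"`
}

// Running `easyjson -all user.go` generates user_easyjson.go, which gives
// the struct its own reflection-free methods, e.g.:
//
//	u := User{Name: "pony", Age: 18}
//	b, err := u.MarshalJSON()  // generated code, no reflect
//	err = u.UnmarshalJSON(b)   // generated code, no reflect
```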
jsonparser
This is a JSON parsing library I personally like a lot, and its 3.9k stars show its popularity is no slouch either. Its GitHub homepage claims up to 10x the performance of the official library.
Still, the same caveat: an open source project's own benchmark conclusions cannot be adopted without thinking. I did reproduce the 10x figure myself, but it does not represent all scenarios.
Why is jsonparser so fast? Because jsonparser itself is only responsible for locating certain key boundary characters in the byte data, for example:
- Find a `"`, then find the closing `"`; what lies between them is a string
- Find a `[`, then find the matching `]`; what lies between them is an array
- Find a `{`, then find the matching `}`; what lies between them is an object
- ……
It then hands the located []byte segments to the caller for further processing; at that point the caller is responsible for parsing these bytes and checking their validity.
Why would I like an open source library that sounds this troublesome? Because developers can build special-purpose logic on top of jsonparser, or even build their own JSON parsing library with it. My own open source project jsonvalue was also implemented on top of jsonparser in its early days; I later dropped jsonparser to optimize performance further, but that does not diminish my respect for it.
jsonvalue
This project is my own JSON manipulation library. It was originally designed to replace the native library's map[string]interface{} approach to unstructured JSON data. I have another article on exactly that question: "[Still using map[string]interface{} to process JSON? Let me tell you a more efficient method — jsonvalue][2]".
I have largely finished optimizing the library (see the master branch): its performance is far above the native json library and slightly better than jsoniter. Of course, that holds only under specific conditions; the various libraries perform very differently across very different scenarios. Showing that is one of my purposes in writing this article.
JSON processing under normal operations
Besides struct and map, what else is there? Below I list the scenarios I have actually hit in business development. All the test code is open source; readers are welcome to check it out and send me feedback via issues, comments, or private messages.
Routine operation: struct parsing
Struct parsing is the most common way to handle JSON in Go. Here I define the following struct:
```go
type object struct {
	Int    int       `json:"int"`
	Float  float64   `json:"float"`
	String string    `json:"string"`
	Object *object   `json:"object,omitempty"`
	Array  []*object `json:"array,omitempty"`
}
```
It is a bit nasty: this struct can nest itself indefinitely.
I then define a byte stream; dropped into json.cn, it turns out to be a JSON object nested five levels deep:
{"int":123456,"float":123.456789,"string":"Hello, world!","object":{"int":123456,"float":123.456789,"string":"Hello, world!","object":{"int":123456,"float":123.456789,"string":"Hello, world!","object":{"int":123456,"float":123.456789,"string":"Hello, world!","object":{"int":123456,"float":123.456789,"string":"Hello, world!"},"array":[{"int":123456,"float":123.456789,"string":"Hello, world!"},{"int":123456,"float":123.456789,"string":"Hello, world!"}]}}},"array":[{"int":123456,"float":123.456789,"string":"Hello, world!"},{"int":123456,"float":123.456789,"string":"Hello, world!"}]}
Using this struct and this data, I benchmarked Marshal and Unmarshal for the official encoding/json, jsoniter, and easyjson. First, the deserialization (Unmarshal) results:
| Package | Function | Time per iteration | Memory per iteration | Allocs per iteration | Performance rating |
|---|---|---|---|---|---|
| encoding/json | Unmarshal | 8775 ns/op | 1144 B/op | 25 allocs/op | ★★ |
| jsoniter | Unmarshal | 6890 ns/op | 1720 B/op | 56 allocs/op | ★★☆ |
| easyjson | UnmarshalJSON | 4017 ns/op | 784 B/op | 19 allocs/op | ★★★★★ |
And the serialization (Marshal) results:
| Package | Function | Time per iteration | Memory per iteration | Allocs per iteration | Performance rating |
|---|---|---|---|---|---|
| encoding/json | Marshal | 6859 ns/op | 1882 B/op | 6 allocs/op | ★★ |
| jsoniter | Marshal | 6843 ns/op | 1882 B/op | 6 allocs/op | ★★ |
| easyjson | MarshalJSON | 2463 ns/op | 1240 B/op | 5 allocs/op | ★★★★★ |
From a pure performance standpoint, easyjson lives up to its per-struct generated serialization/deserialization code: it achieves the highest performance in both directions, 2.5 to 3 times faster than the other two libraries. jsoniter is somewhat faster than the official json, but not by much.
The routine "unconventional" operation: map[string]interface{}
I call this "unconventional" because here the program has to handle unstructured JSON data, or handle several different data structures within one function, so struct mode is not an option. The official library's solution is to use map[string]interface{} to hold object-typed data. In this scenario, only the official json and jsoniter are in the running.
The test data follows; deserialization first:
| Package | Function | Time per iteration | Memory per iteration | Allocs per iteration | Performance rating |
|---|---|---|---|---|---|
| encoding/json | Unmarshal | 13040 ns/op | 4512 B/op | 128 allocs/op | ★★ |
| jsoniter | Unmarshal | 9442 ns/op | 4521 B/op | 136 allocs/op | ★★ |
The serialization test data is as follows:
| Package | Function | Time per iteration | Memory per iteration | Allocs per iteration | Performance rating |
|---|---|---|---|---|---|
| encoding/json | Marshal | 17140 ns/op | 5865 B/op | 121 allocs/op | ★★ |
| jsoniter | Marshal | 17132 ns/op | 5865 B/op | 121 allocs/op | ★★ |
As you can see, in this situation the two libraries are much of a muchness, and jsoniter has no obvious advantage. Even for jsoniter's selling point of parsing large volumes of data, the gains are minimal.
For the same amount of data, both libraries' deserialization time is roughly twice that of the struct case, and serialization time about 2.5 times.
Emmm... if you can avoid this mode of operation, avoid it. Worse, the program then has to fight its way through interface{} type assertions afterwards; you can read my earlier article to feel the pain.
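To make that pain concrete, here is a sketch (the field names response/userList/name are illustrative, and data is assumed to hold the JSON bytes) of digging one value out of a map[string]interface{}:

```go
var m map[string]interface{}
if err := json.Unmarshal(data, &m); err != nil {
	// err handling
}
// Every level of nesting needs its own type assertion.
resp, ok := m["response"].(map[string]interface{})
if !ok { /* err handling */ }
list, ok := resp["userList"].([]interface{})
if !ok || len(list) == 0 { /* err handling */ }
first, ok := list[0].(map[string]interface{})
if !ok { /* err handling */ }
name, _ := first["name"].(string)
fmt.Println("username:", name)
```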
Unconventional operations: deserialization
When struct cannot be used, the various open source projects each show off their own special powers. Every library actually offers detailed and powerful extra features, far more than one article can cover. Here I list a few libraries and their representative design ideas; test data for the various cases follows afterwards.
jsoniter
For unstructured JSON, if you want to parse a piece of []byte data and extract one of its values, jsoniter offers solutions along the following lines.
The first is to parse the raw bytes directly and return the requested value:
```go
// Read the name field of the first element of the response.userList
// array in the raw bytes.
username := jsoniter.Get(data, "response", "userList", 0, "name")
fmt.Println("username:", username.ToString())
```
It can also return an object first, and you keep operating on that object:

```go
obj := jsoniter.Get(data)
if obj.ValueType() == jsoniter.InvalidType {
	// err handling
}
username := obj.Get("response", "userList", 0, "name")
fmt.Println("username:", username.ToString())
```
The striking feature of this API is on-demand parsing. In the statement obj := jsoniter.Get(data), jsoniter performs only a minimal check — at most establishing that the current data is an object-typed JSON — and parses nothing else.
Even in the second call, obj.Get("response", "userList", 0, "name"), jsoniter does its best to avoid unnecessary work and parses only the parts it has to.
For example, if the request only needs the response.userList value, then when jsoniter runs into unrelated fields such as response.gameList, it tries to skip over them without processing them, cutting the CPU time spent on irrelevant data as far as possible.
Note, however, that judging from its interface the returned obj should be treated as read-only: it cannot be re-serialized into a byte sequence.
jsonparser
Compared with jsoniter, jsonparser's support for parsing a piece of []byte data and extracting one of its values is more limited.
If we know the type of a value in advance — say, the username field above — we can fetch it like this:
```go
username, err := jsonparser.GetString(data, "response", "userList", "[0]", "name")
if err != nil {
	// err handling
}
fmt.Println("username:", username)
```
But jsonparser's Get-series functions can only fetch the basic types other than null, that is: number, boolean, and string.
To work with objects and arrays, you must get familiar with the following two functions, which I personally consider the core of jsonparser:
```go
func ArrayEach(data []byte, cb func(value []byte, dataType ValueType, offset int, err error), keys ...string) (offset int, err error)
func ObjectEach(data []byte, callback func(key []byte, value []byte, dataType ValueType, offset int) error, keys ...string) (err error)
```
These two functions scan the byte data sequentially and hand each extracted segment back to the caller through the callback; the caller then operates on the data from there — building maps, building slices, or even doing things that are normally impossible (explained later). A small sketch follows.
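For instance, a minimal sketch (assuming data holds JSON bytes with an object under a response key) of collecting a sub-object's keys and values into a map:

```go
m := map[string]string{}
err := jsonparser.ObjectEach(data, func(key, value []byte, dataType jsonparser.ValueType, offset int) error {
	// jsonparser only hands out raw byte segments; what to do with
	// them — here, copying into a map — is entirely up to the caller.
	m[string(key)] = string(value)
	return nil
}, "response")
if err != nil {
	// err handling
}
```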
jsonvalue
This is the open source Go JSON manipulation library I developed myself. The API style of its Get operations resembles the second jsoniter style above.
For example, to fetch the same username field mentioned earlier:
```go
v, err := jsonvalue.Unmarshal(data)
if err != nil {
	// err handling
}
username, _ := v.GetString("response", "userList", 0, "name")
fmt.Println("username:", username)
```
Performance test comparison
In the "unconventional operation" scenario mentioned in this section, among the three libraries, jsoniter and jsonparser are both "analyzed on demand" during parsing, while jsonvalue is a comprehensive parsing. Therefore, there are still differences when making test plans.
Here I first throw out the test data, there are two parts in the test evaluation:
- Performance Evaluation: Indicates the performance score in this scenario, regardless of whether it is easy to use, only considering whether the CPU execution efficiency is high
- Function evaluation: Indicates whether the subsequent processing of the program is convenient after obtaining the data in this scenario. Regardless of whether the deserialization performance is high or not
| Package | Function description / main calls | Time per iteration | Memory per iteration | Allocs per iteration | Performance rating | Functional rating |
|---|---|---|---|---|---|---|
| **Shallow parsing** | | | | | | |
| jsoniter | any := jsoniter.Get(raw); keys := any.Keys() | 9118 ns/op | 3024 B/op | 139 allocs/op | ☆ | ★★★ |
| jsonvalue | jsonvalue.Unmarshal() | 7684 ns/op | 9072 B/op | 61 allocs/op | ★ | ★★★★★ |
| jsonparser | jsonparser.ObjectEach(raw, objEach) | 853 ns/op | 0 B/op | 0 allocs/op | ★★★★★ | ★★ |
| **Reading one deeply nested value** | | | | | | |
| jsoniter | any.Get("object", "object", "object", "array", 1) | 9118 ns/op | 3024 B/op | 139 allocs/op | ☆ | ★★★★★ |
| jsonvalue | jsonvalue.Unmarshal(); v.Get("object", "object", "object", "array", 1) | 7928 ns/op | 9072 B/op | 61 allocs/op | ★ | ★★★★★ |
| jsonparser | jsonparser.Get(raw, "object", "object", "object", "array", "[1]") | 917 ns/op | 0 B/op | 0 allocs/op | ★★★★★ | ★★☆ |
| **Reading one deeply nested value from a large (100x) body of data** | | | | | | |
| jsoniter | jsoniter.Get(raw, "10", "object", "object", "object", "array", 1) | 29967 ns/op | 4913 B/op | 469 allocs/op | ★ | ★★★★★ |
| jsonvalue | jsonvalue.Unmarshal(); v.Get("10", "object", "object", "object", "array", 1) | 799450 ns/op | 917030 B/op | 6011 allocs/op | ☆ | ★★★★★ |
| jsonparser | jsonparser.Get(raw, "10", "object", "object", "object", "array", "[1]") | 8826 ns/op | 0 B/op | 0 allocs/op | ★★★★★ | ★★☆ |
| **Full traversal** | | | | | | |
| jsoniter | jsoniter.Get(raw), recursively visiting every child | 45237 ns/op | 12659 B/op | 671 allocs/op | ☆ | ★★ |
| jsonvalue | jsonvalue.Unmarshal() | 7928 ns/op | 9072 B/op | 61 allocs/op | ★★★ | ★★★★★ |
| jsonparser | jsonparser.ObjectEach(raw, objEach), recursively visiting every child | 3705 ns/op | 0 B/op | 0 allocs/op | ★★★★★ | ☆ |
| encoding/json | Unmarshal into map (for reference) | 13040 ns/op | 4512 B/op | 128 allocs/op | ★ | ★ |
| jsoniter | Unmarshal into map (for reference) | 9442 ns/op | 4521 B/op | 136 allocs/op | ★☆ | ★ |
The test data above covers four deserialization scenarios. Let me explain in detail where each scenario applies and give the corresponding selection suggestions.
Shallow parsing
In the test code, shallow parsing means taking a deeply nested structure and parsing out only its shallowest key list. This scenario is mostly for reference. jsonparser's performance blows the other libraries away: it extracts the first-level key list extremely fast. In terms of ease of use, however, both jsonparser and jsoniter still require the developer to process the returned data further, so their usability is slightly lower in this scenario.
Getting specific data out of the body
The scenario is this: of the whole JSON body, only a small part of the data is useful to the current business and needs to be read. I split it into two cases:
Useful data makes up a high proportion of the whole (the "reading one deeply nested value" rows):
- Performance-wise, jsonparser is as superior as ever
- Usability-wise, jsonparser requires the caller to process the data again, so jsoniter and jsonvalue are better
Useful data makes up a low proportion of the whole (the "reading one deeply nested value from a large (100x) body of data" rows):
- Performance-wise, jsonparser still crushes everything else
- Usability-wise, jsonparser remains the weakest
- Weighing usability against performance: the lower the proportion of useful data in this scenario, the more valuable jsonparser becomes
The business needs the data fully parsed — the scenario that most thoroughly exercises each solution's overall capability
- Performance-wise jsonparser still comes out on top, but here its usability is a genuine problem: in a complex traversal you have to wrap your own logic around it to store the data
- Second in performance is jsonvalue, which is also where I, as its author, am quite confident:
- jsonvalue completes a full parse in less time than the supposedly high-speed jsoniter
- Against jsonparser, jsonvalue's processing time looks like 2.5x on the surface, but jsonparser only semi-processes the data, while jsonvalue hands the caller a finished product
- As for jsoniter: do not use it in this scenario — when the data must be fully parsed, its numbers are almost too ugly to look at
- Finally, the official json library and jsoniter parsing into a map are included for reference only; they are not recommended in this scenario either
Unconventional operations: serialization
This means serializing a piece of data when there is no struct for it. It generally happens when:
- The format of the data to be serialized is uncertain and may be generated from other parameters
- There is too much trivial data to serialize, and defining a struct for each piece and marshaling one by one would wreck the code's readability
The first solution for this scenario is the "routine unconventional operation" described above: use a map.
For genuinely unconventional approaches, we first rule out jsoniter and jsonparser, since neither has a direct way to build a custom JSON structure, and then easyjson, since it cannot operate on a map. That leaves jsonvalue.
Say we return a user's nickname, with a response format of {"code":0,"message":"success","data":{"nickname":"振兴中华"}}. The map-based code looks like this:
```go
code := 0
nickname := "振兴中华"
res := map[string]interface{}{
	"code":    code,
	"message": "success",
	"data": map[string]string{
		"nickname": nickname, // note: the trailing comma is required here
	},
}
b, _ := json.Marshal(&res)
```
And the jsonvalue version:

```go
res := jsonvalue.NewObject()
res.SetInt(0).At("code")
res.SetString("success").At("message")
res.SetString(nickname).At("data", "nickname")
b := res.MustMarshal()
```
Ease of use, it must be said, is excellent. Serializing with the official json, jsoniter, and jsonvalue respectively, the measured data are:
| Package | Function | Time per iteration | Memory per iteration | Allocs per iteration | Performance rating |
|---|---|---|---|---|---|
| encoding/json | Marshal | 16273 ns/op | 5865 B/op | 121 allocs/op | ★☆ |
| jsoniter | Marshal | 16616 ns/op | 5865 B/op | 121 allocs/op | ★☆ |
| jsonvalue | Marshal | 4521 ns/op | 2224 B/op | 5 allocs/op | ★★★★★ |
The result is already obvious. The reason is easy to understand: when working with a map, the libraries must fall back on the reflect mechanism to determine each value's type, which drags program performance down badly.
Conclusions and selection suggestions
Struct serialization and deserialization
In this scenario my first recommendation is the official json library — which may surprise readers. My reasoning:
- easyjson's performance overwhelms every other open source project, but it has one big flaw: it needs an extra tool to generate the code, and version-controlling that tool adds a bit of operational cost. Of course, if your team already manages protobuf well, the same pipeline can manage easyjson
- Before Go 1.8, the official json library's performance drew criticism from many directions. Today (1.16.3), however, its performance is no longer what it was. As the most widely used json library bar none, the official one also has the fewest bugs and the best compatibility
- jsoniter's performance is still better than the official library's, but it is not in a different league. If you want the ultimate performance, you should pick easyjson, not jsoniter
- jsoniter has been inactive in recent years: I filed an issue a while ago and nobody responded; browsing the issue list later, I found issues still open from 2018
Serialization and deserialization of unstructured data
In this scenario we have to consider two cases: high data utilization and low data utilization. Data utilization means the proportion of the JSON body the business actually needs to attend to and process; if more than a quarter of the data matters, I count the utilization as high.
- High data utilization: here I recommend jsonvalue
- Low data utilization: this splits again on whether the JSON data needs to be re-serialized afterwards
- No re-serialization needed: just pick jsonparser; its performance is truly dazzling
- Re-serialization needed: there are two choices here. If the performance requirements are modest, jsonvalue works; if the performance requirements are high and only a single value needs inserting into the byte sequence (important), you can use jsonparser's Set method — see the godoc and the sketch after this list
In practice, cases where the JSON body is large and must also be re-serialized at the same time are rare; they mostly arise in proxy servers, gateways, relay/overlay services and the like, where extra information has to be injected into the original data. In other words, jsonparser's applicable scenarios are relatively limited.
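As a hedged sketch of that single-value insertion (the field names are mine), jsonparser.Set rewrites one value directly in the raw bytes without a full parse:

```go
raw := []byte(`{"code":0,"data":{"nickname":"pony"}}`)
// Insert or overwrite a single value in the byte sequence. Note that
// setValue must itself be valid JSON, hence the quoted string below.
newRaw, err := jsonparser.Set(raw, []byte(`"shenzhen"`), "data", "city")
if err != nil {
	// err handling
}
fmt.Println(string(newRaw)) // prints the updated JSON
```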
The chart below compares the libraries' running efficiency at data coverage rates from 1% to 60% (ordinate unit: μs/op).
As it shows, once data utilization reaches about 25%, jsoniter no longer has any advantage over jsonvalue; for jsonparser the crossover is around 40%. As for jsonvalue, because it fully parses the data once, value access after parsing costs very little time, so its total time consumption stays stable across the different coverage rates.
Other hacky operations
In real applications I have run into some odd JSON processing scenarios. I'll take this opportunity to list them and share my solutions.
Case-insensitive JSON
As mentioned above: "When json is parsed, if it encounters a case problem, it will do its case conversion as much as possible. Even if a key is different from the definition in the structure, if it is the same after ignoring the case, then it will still Able to assign values to fields."
But if you are using map, jsoniter, jsonparser, this is a big problem. We have two services that operate on the same field in the MySQL database at the same time, but in the structure defined by the two Go services, the case of one letter is inconsistent. This problem has existed for a long time, but because of the above-mentioned characteristics of the official json parsing structure, this problem has not been exposed. Until one day, when we wrote a script program to wash the data and used the map method to read this field, the bug was exposed.
So I later added caseless support to jsonvalue to solve this problem:
raw := `{"user":{"nickName":"pony"}}` // 注意当中的 N
v, _ := jsonvalue.UnmarshalString(raw)
fmt.Println("nickname:", v.GetString("user", "nickname"))
fmt.Println("nickname:", v.Caseless().GetString("user", "nickname"))
// 输出
// nickname:
// nickname: pony
Ordered JSON object
In an interface with a sibling module we cooperate with, the other side pushes a data stream to our business module formatted as a JSON object. Later, a new requirement came in: the pushed data had to be ordered. If the interface format were changed to an array, both sides' data structures would need significant changes; besides, old and new modules inevitably coexist during a rolling upgrade, so the interface would have to stay compatible with both formats.
In the end we adopted a rather dirty trick: the data producer emits the KVs in the required order, and we, the consumer, use jsonparser's ObjectEach function, which yields the KVs in their byte-sequence order, to read the data in that same order.
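A minimal sketch of that trick (assuming raw holds the pushed JSON object as []byte; ObjectEach walks the object in byte order, so appending to a slice preserves the producer's ordering):

```go
type kv struct {
	key   string
	value []byte
}

ordered := make([]kv, 0, 16)
err := jsonparser.ObjectEach(raw, func(key, value []byte, dataType jsonparser.ValueType, offset int) error {
	// Keys arrive in the order they appear in the byte stream — exactly
	// the order the producer wrote them in.
	ordered = append(ordered, kv{key: string(key), value: value})
	return nil
})
if err != nil {
	// err handling
}
```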
Cross-language UTF-8 string interop
Go is a very young language; by the time it was born, the mainstream character set on the internet was already Unicode and the mainstream encoding UTF-8. Languages of older generations may use other encodings for assorted historical reasons.
As a result, different teams and companies may encode Unicode wide characters differently when exchanging JSON across languages. If you run into this, the solution is to agree on ASCII encoding across the board. With the official json library, you can refer to the relevant Q&A on escaping wide characters.
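The official library has no built-in switch for this, so one common approach — a sketch of my own, covering only characters in the Basic Multilingual Plane (runes above U+FFFF would need surrogate pairs) — is to post-process the marshaled bytes:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// escapeUnicode rewrites every non-ASCII rune in marshaled JSON
// as a \uXXXX escape sequence.
func escapeUnicode(b []byte) []byte {
	var buf bytes.Buffer
	for _, r := range string(b) {
		if r > 127 {
			fmt.Fprintf(&buf, `\u%04X`, r)
		} else {
			buf.WriteRune(r)
		}
	}
	return buf.Bytes()
}

func main() {
	b, _ := json.Marshal(map[string]string{"nation": "中国"})
	fmt.Println(string(escapeUnicode(b))) // {"nation":"\u4E2D\u56FD"}
}
```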
If you are using jsonvalue, ASCII escaping is the default, for example:
```go
v := jsonvalue.NewObject()
v.SetString("中国").At("nation")
fmt.Println(v.MustMarshalString())
// Output:
// {"nation":"\u4E2D\u56FD"}
```
Reference
Open source libraries involved in this article:
- jsoniter
- rapidjson: another library I came across along the way, implemented with cgo. Personally I think that goes a bit far — if the performance requirements are really that extreme, it would be better to use C++ directly. The library is also no longer maintained; readers only need to know it exists.
- jsonparser
- easyjson
- jsonvalue
- Go monkeypatching
For the test data and test methods used in this article, see:
- Escaping and Unicode encoding in JSON serialization
- "The fastest JSON parser in the world, 10x faster than the rest"
- json-iterator/go usage notes
- How to evaluate jsoniter, which claims to be the fastest JSON parser?
- Go Learning_28: Using easyjson to parse json data efficiently
This article is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Link to this article: https://segmentfault.com/a/1190000039957766
Original author: amc. The original was first published in the Tencent Cloud+ Community and also appears on the author's blog. Reprints are welcome, but please cite the source.
Original title: "What's wrong with Go's native json package? How can we handle JSON data better?"
Release date: 2021-05-06
Original link: https://cloud.tencent.com/developer/article/1820473