Reputation: 1068
I'm having some trouble parsing a partly static, partly dynamic YAML file with Go. This is the YAML file I'm using:
#this part is fixed
stack:
  platformName: 'default'
  stack-image-name: 'stack-v1'
  stack-image-version: '1.5.2'
  structure: 'flat'
#This part is dynamic and can change
# ===== Apache Zookeeper ========
ZOOKEEPER_enable: false
ZOOKEEPER_volume_map_data: false
ZOOKEEPER_nodes: 1 # either 1 or 3
ZOOKEEPER_navigator_enable: false
# ===== Apache Kafka ========
KAFKA_enable: false
# one of enterprise, community
KAFKA_edition: 'community'
KAFKA_volume_map_data: false
KAFKA_broker_nodes: 3
KAFKA_delete_topic_enable: false
KAFKA_auto_create_topics_enable: false
#kafka schema registry
KAFKA_SCHEMA_REGISTRY_enable: false
KAFKA_SCHEMA_REGISTRY_nodes: 1
KAFKA_SCHEMA_REGISTRY_use_zookeeper_election: false
KAFKA_SCHEMA_REGISTRY_replication_factor: 1
#kafka connect
KAFKA_CONNECT_enable: false
KAFKA_CONNECT_nodes: 2
#ksqldb
KAFKA_KSQLDB_enable: false
KAFKA_KSQLDB_edition: 'oss'
KAFKA_KSQLDB_nodes: 1
#confluent control center
KAFKA_CCC_enable: false
KAFKA_RESTPROXY_enable: false
KAFKA_MQTTPROXY_enable: false
KAFKA_SCHEMA_REGISTRY_UI_enable: false
KAFKA_TOPICS_UI_enable: false
KAFKA_CONNECT_UI_enable: false
KAFKA_CMAK_enable: false
KAFKA_KAFDROP_enable: false
KAFKA_KADMIN_enable: false
KAFKA_AKHQ_enable: false
KAFKA_BURROW_enable: false
# ===== Hadoop ========
HADOOP_enable: false
HADOOP_datanodes: 2
# ===== Spark ========
SPARK_enable: false
# "hive" or "in-memory"
SPARK_catalog: in-memory
SPARK_workers: 2
SPARK_jars_packages: ''
SPARK_jars_ivySettings: ''
SPARK_driver_extraJavaOptions: ''
SPARK_executor_extraJavaOptions: ''
Below are the structs:
type YAMLFile struct {
    Stack    Stack `yaml:"stack"`
    Services map[string]interface{}
}
type Stack struct {
    PlatformName      string `yaml:"platformName"`
    StackImageName    string `yaml:"stack-image-name"`
    StackImageVersion string `yaml:"stack-image-version"`
    Structure         string `yaml:"structure"`
}
This is how I'm parsing the file:
var yamlFile YAMLFile
if configFile == "" {
    log.Fatal("Unable to run command as configFile is empty")
}
ymlContent, err := ioutil.ReadFile(configFile)
if err != nil {
    panic(err)
}
err = yaml.Unmarshal(ymlContent, &yamlFile)
The content under stack is parsed into the Stack struct correctly, but I can't find a way to parse the rest into a dynamic map or collection. I've tried map[interface{}]interface{} as well as a struct and map[string]interface{}, but the Services property is always nil.
Any ideas? I found a lot of posts about parsing YAML, but none of them worked for me.
Upvotes: 0
Views: 2130
Reputation: 2626
You can't use the struct you have defined to parse the rest of the YAML fields into a map the way you want.
This is because your struct has the form:
type YAMLFile struct {
    Stack    Stack `yaml:"stack"`
    Services map[string]interface{}
}
When you unmarshal your file with this struct, the library looks for a field named "services" in your YAML file (yaml.v2 defaults to the lowercased struct field name). Since no such field exists, the Services map stays empty.
There are two ways to parse a dynamic YAML file whose structure is unknown. If you want the rest of the file, apart from the stack section, to be parsed into your map, then you have to change your YAML file to match your struct, like this:
#this part is fixed
stack:
  platformName: "default"
  stack-image-name: "stack-v1"
  stack-image-version: "1.5.2"
  structure: "flat"
#This part is dynamic and can change
# ===== Apache Zookeeper ========
services:
  ZOOKEEPER_enable: false
  ZOOKEEPER_volume_map_data: false
  ZOOKEEPER_nodes: 1 # either 1 or 3
  ZOOKEEPER_navigator_enable: false
  # ===== Apache Kafka ========
  KAFKA_enable: false
  # one of enterprise, community
  KAFKA_edition: "community"
  KAFKA_volume_map_data: false
  KAFKA_broker_nodes: 3
  KAFKA_delete_topic_enable: false
  KAFKA_auto_create_topics_enable: false
  #kafka schema registry
  KAFKA_SCHEMA_REGISTRY_enable: false
  KAFKA_SCHEMA_REGISTRY_nodes: 1
  KAFKA_SCHEMA_REGISTRY_use_zookeeper_election: false
  KAFKA_SCHEMA_REGISTRY_replication_factor: 1
  #kafka connect
  KAFKA_CONNECT_enable: false
  KAFKA_CONNECT_nodes: 2
  #ksqldb
  KAFKA_KSQLDB_enable: false
  KAFKA_KSQLDB_edition: "oss"
  KAFKA_KSQLDB_nodes: 1
  #confluent control center
  KAFKA_CCC_enable: false
  KAFKA_RESTPROXY_enable: false
  KAFKA_MQTTPROXY_enable: false
  KAFKA_SCHEMA_REGISTRY_UI_enable: false
  KAFKA_TOPICS_UI_enable: false
  KAFKA_CONNECT_UI_enable: false
  KAFKA_CMAK_enable: false
  KAFKA_KAFDROP_enable: false
  KAFKA_KADMIN_enable: false
  KAFKA_AKHQ_enable: false
  KAFKA_BURROW_enable: false
  # ===== Hadoop ========
  HADOOP_enable: false
  HADOOP_datanodes: 2
  # ===== Spark ========
  SPARK_enable: false
  # "hive" or "in-memory"
  SPARK_catalog: in-memory
  SPARK_workers: 2
  SPARK_jars_packages: ""
  SPARK_jars_ivySettings: ""
  SPARK_driver_extraJavaOptions: ""
  SPARK_executor_extraJavaOptions: ""
As you can see, I've moved the rest of the file's content under a field called "services" so that it unmarshals into your struct as you expect.
OR
The other way is to not use your struct at all and instead treat the whole file as dynamic, rather than only part of it. If you follow this method, you can change your code to this:
package main

import (
    "fmt"
    "io/ioutil"

    yaml "gopkg.in/yaml.v2"
)

func main() {
    yamlMap := make(map[interface{}]interface{})
    ymlContent, err := ioutil.ReadFile("./test.yaml")
    if err != nil {
        panic(err)
    }
    err = yaml.Unmarshal(ymlContent, &yamlMap)
    if err != nil {
        panic(err)
    }
    fmt.Printf("%v", yamlMap)
}
Upvotes: 1