databricks.getCluster
Note If you have a fully automated setup with workspaces created by databricks.MwsWorkspaces or azurerm_databricks_workspace, please make sure to add a depends_on attribute in order to prevent "default auth: cannot configure default credentials" errors.
Retrieves information about a databricks.Cluster using its id. The cluster id can be retrieved programmatically using the databricks.getClusters data source.
Example Usage
Retrieve attributes of each cluster in a workspace
import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";
const all = databricks.getClusters({});
const allGetCluster = all.then(all => all.ids.reduce((__obj, id) => ({ ...__obj, [id]: databricks.getCluster({
    clusterId: id,
}) }), {}));
import pulumi
import pulumi_databricks as databricks
all = databricks.get_clusters()
all_get_cluster = {cluster_id: databricks.get_cluster(cluster_id=cluster_id) for cluster_id in all.ids}
Coming soon!
using System.Collections.Generic;
using System.Linq;
using Pulumi;
using Databricks = Pulumi.Databricks;
return await Deployment.RunAsync(() => 
{
    var all = Databricks.GetClusters.Invoke();
    var allGetCluster = all.Apply(res => res.Ids.ToDictionary(
        id => id,
        id => Databricks.GetCluster.Invoke(new Databricks.GetClusterInvokeArgs
        {
            ClusterId = id,
        })));
});
Coming soon!
Coming soon!
Using getCluster
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getCluster(args: GetClusterArgs, opts?: InvokeOptions): Promise<GetClusterResult>
function getClusterOutput(args: GetClusterOutputArgs, opts?: InvokeOptions): Output<GetClusterResult>

def get_cluster(cluster_id: Optional[str] = None,
                cluster_info: Optional[GetClusterClusterInfo] = None,
                cluster_name: Optional[str] = None,
                id: Optional[str] = None,
                opts: Optional[InvokeOptions] = None) -> GetClusterResult
def get_cluster_output(cluster_id: Optional[pulumi.Input[str]] = None,
                cluster_info: Optional[pulumi.Input[GetClusterClusterInfoArgs]] = None,
                cluster_name: Optional[pulumi.Input[str]] = None,
                id: Optional[pulumi.Input[str]] = None,
                opts: Optional[InvokeOptions] = None) -> Output[GetClusterResult]

func LookupCluster(ctx *Context, args *LookupClusterArgs, opts ...InvokeOption) (*LookupClusterResult, error)
func LookupClusterOutput(ctx *Context, args *LookupClusterOutputArgs, opts ...InvokeOption) LookupClusterResultOutput

Note: This function is named LookupCluster in the Go SDK.
public static class GetCluster 
{
    public static Task<GetClusterResult> InvokeAsync(GetClusterArgs args, InvokeOptions? opts = null)
    public static Output<GetClusterResult> Invoke(GetClusterInvokeArgs args, InvokeOptions? opts = null)
}

public static CompletableFuture<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
public static Output<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
fn::invoke:
  function: databricks:index/getCluster:getCluster
  arguments:
    # arguments dictionary

The following arguments are supported:
- ClusterId string
- The id of the cluster.
- ClusterInfo GetClusterClusterInfo
- block, consisting of following fields:
- ClusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- Id string
- cluster ID
- ClusterId string
- The id of the cluster.
- ClusterInfo GetClusterClusterInfo
- block, consisting of following fields:
- ClusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- Id string
- cluster ID
- clusterId String
- The id of the cluster.
- clusterInfo GetClusterClusterInfo
- block, consisting of following fields:
- clusterName String
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- id String
- cluster ID
- clusterId string
- The id of the cluster.
- clusterInfo GetClusterClusterInfo
- block, consisting of following fields:
- clusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- id string
- cluster ID
- cluster_id str
- The id of the cluster.
- cluster_info GetClusterClusterInfo
- block, consisting of following fields:
- cluster_name str
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- id str
- cluster ID
- clusterId String
- The id of the cluster.
- clusterInfo Property Map
- block, consisting of following fields:
- clusterName String
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- id String
- cluster ID
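The cluster_name argument's uniqueness constraint can be sketched in plain Python. The helper below is hypothetical (not part of the SDK) and uses an in-memory id-to-name mapping to mimic how a by-name lookup must fail unless exactly one cluster matches:

```python
def resolve_cluster_id(clusters: dict, name: str) -> str:
    """Resolve a cluster name to its id.

    Mirrors the cluster_name semantics: the lookup succeeds only when
    exactly one cluster carries the requested name.
    """
    matches = [cid for cid, n in clusters.items() if n == name]
    if len(matches) != 1:
        raise ValueError(
            f"expected exactly one cluster named {name!r}, found {len(matches)}"
        )
    return matches[0]
```

In a real program you would pass cluster_id (always unambiguous) whenever more than one cluster may share a name.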
getCluster Result
The following output properties are available:
- ClusterId string
- ClusterInfo GetClusterClusterInfo
- block, consisting of following fields:
- ClusterName string
- Cluster name, which doesn’t have to be unique.
- Id string
- cluster ID
- ClusterId string
- ClusterInfo GetClusterClusterInfo
- block, consisting of following fields:
- ClusterName string
- Cluster name, which doesn’t have to be unique.
- Id string
- cluster ID
- clusterId String
- clusterInfo GetClusterClusterInfo
- block, consisting of following fields:
- clusterName String
- Cluster name, which doesn’t have to be unique.
- id String
- cluster ID
- clusterId string
- clusterInfo GetClusterClusterInfo
- block, consisting of following fields:
- clusterName string
- Cluster name, which doesn’t have to be unique.
- id string
- cluster ID
- cluster_id str
- cluster_info GetClusterClusterInfo
- block, consisting of following fields:
- cluster_name str
- Cluster name, which doesn’t have to be unique.
- id str
- cluster ID
- clusterId String
- clusterInfo Property Map
- block, consisting of following fields:
- clusterName String
- Cluster name, which doesn’t have to be unique.
- id String
- cluster ID
Supporting Types
GetClusterClusterInfo   
- Autoscale GetClusterClusterInfoAutoscale
- AutoterminationMinutes int
- Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- AwsAttributes GetClusterClusterInfoAwsAttributes
- AzureAttributes GetClusterClusterInfoAzureAttributes
- ClusterCores double
- ClusterId string
- The id of the cluster.
- ClusterLogConf GetClusterClusterInfoClusterLogConf
- ClusterLogStatus GetClusterClusterInfoClusterLogStatus
- ClusterMemoryMb int
- ClusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- ClusterSource string
- CreatorUserName string
- CustomTags Dictionary<string, string>
- Additional tags for cluster resources.
- DataSecurityMode string
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- DefaultTags Dictionary<string, string>
- DockerImage GetClusterClusterInfoDockerImage
- Driver GetClusterClusterInfoDriver
- DriverInstancePoolId string
- similar to instance_pool_id, but for driver node.
- DriverNodeTypeId string
- The node type of the Spark driver.
- EnableElasticDisk bool
- Use autoscaling local storage.
- EnableLocalDiskEncryption bool
- Enable local disk encryption.
- Executors List<GetClusterClusterInfoExecutor>
- GcpAttributes GetClusterClusterInfoGcpAttributes
- InitScripts List<GetClusterClusterInfoInitScript>
- InstancePoolId string
- The pool of idle instances the cluster is attached to.
- IsSingleNode bool
- JdbcPort int
- Kind string
- LastRestartedTime int
- LastStateLossTime int
- NodeTypeId string
- Any supported databricks.getNodeType id.
- NumWorkers int
- PolicyId string
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RuntimeEngine string
- The type of runtime of the cluster.
- SingleUserName string
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf Dictionary<string, string>
- Map with key-value pairs to fine-tune Spark clusters.
- SparkContextId int
- SparkEnvVars Dictionary<string, string>
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SparkVersion string
- Runtime version of the cluster.
- Spec GetClusterClusterInfoSpec
- SshPublicKeys List<string>
- SSH public key contents that will be added to each Spark node in this cluster.
- StartTime int
- State string
- StateMessage string
- TerminatedTime int
- TerminationReason GetClusterClusterInfoTerminationReason
- UseMlRuntime bool
- WorkloadType GetClusterClusterInfoWorkloadType
- Autoscale GetClusterClusterInfoAutoscale
- AutoterminationMinutes int
- Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- AwsAttributes GetClusterClusterInfoAwsAttributes
- AzureAttributes GetClusterClusterInfoAzureAttributes
- ClusterCores float64
- ClusterId string
- The id of the cluster.
- ClusterLogConf GetClusterClusterInfoClusterLogConf
- ClusterLogStatus GetClusterClusterInfoClusterLogStatus
- ClusterMemoryMb int
- ClusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- ClusterSource string
- CreatorUserName string
- CustomTags map[string]string
- Additional tags for cluster resources.
- DataSecurityMode string
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- DefaultTags map[string]string
- DockerImage GetClusterClusterInfoDockerImage
- Driver GetClusterClusterInfoDriver
- DriverInstancePoolId string
- similar to instance_pool_id, but for driver node.
- DriverNodeTypeId string
- The node type of the Spark driver.
- EnableElasticDisk bool
- Use autoscaling local storage.
- EnableLocalDiskEncryption bool
- Enable local disk encryption.
- Executors []GetClusterClusterInfoExecutor
- GcpAttributes GetClusterClusterInfoGcpAttributes
- InitScripts []GetClusterClusterInfoInitScript
- InstancePoolId string
- The pool of idle instances the cluster is attached to.
- IsSingleNode bool
- JdbcPort int
- Kind string
- LastRestartedTime int
- LastStateLossTime int
- NodeTypeId string
- Any supported databricks.getNodeType id.
- NumWorkers int
- PolicyId string
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RuntimeEngine string
- The type of runtime of the cluster.
- SingleUserName string
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf map[string]string
- Map with key-value pairs to fine-tune Spark clusters.
- SparkContextId int
- SparkEnvVars map[string]string
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SparkVersion string
- Runtime version of the cluster.
- Spec GetClusterClusterInfoSpec
- SshPublicKeys []string
- SSH public key contents that will be added to each Spark node in this cluster.
- StartTime int
- State string
- StateMessage string
- TerminatedTime int
- TerminationReason GetClusterClusterInfoTerminationReason
- UseMlRuntime bool
- WorkloadType GetClusterClusterInfoWorkloadType
- autoscale GetClusterClusterInfoAutoscale
- autoterminationMinutes Integer
- Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes GetClusterClusterInfoAwsAttributes
- azureAttributes GetClusterClusterInfoAzureAttributes
- clusterCores Double
- clusterId String
- The id of the cluster.
- clusterLogConf GetClusterClusterInfoClusterLogConf
- clusterLogStatus GetClusterClusterInfoClusterLogStatus
- clusterMemoryMb Integer
- clusterName String
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- clusterSource String
- creatorUserName String
- customTags Map<String,String>
- Additional tags for cluster resources.
- dataSecurityMode String
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- defaultTags Map<String,String>
- dockerImage GetClusterClusterInfoDockerImage
- driver GetClusterClusterInfoDriver
- driverInstancePoolId String
- similar to instance_pool_id, but for driver node.
- driverNodeTypeId String
- The node type of the Spark driver.
- enableElasticDisk Boolean
- Use autoscaling local storage.
- enableLocalDiskEncryption Boolean
- Enable local disk encryption.
- executors List<GetClusterClusterInfoExecutor>
- gcpAttributes GetClusterClusterInfoGcpAttributes
- initScripts List<GetClusterClusterInfoInitScript>
- instancePoolId String
- The pool of idle instances the cluster is attached to.
- isSingleNode Boolean
- jdbcPort Integer
- kind String
- lastRestartedTime Integer
- lastStateLossTime Integer
- nodeTypeId String
- Any supported databricks.getNodeType id.
- numWorkers Integer
- policyId String
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine String
- The type of runtime of the cluster.
- singleUserName String
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<String,String>
- Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId Integer
- sparkEnvVars Map<String,String>
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sparkVersion String
- Runtime version of the cluster.
- spec GetClusterClusterInfoSpec
- sshPublicKeys List<String>
- SSH public key contents that will be added to each Spark node in this cluster.
- startTime Integer
- state String
- stateMessage String
- terminatedTime Integer
- terminationReason GetClusterClusterInfoTerminationReason
- useMlRuntime Boolean
- workloadType GetClusterClusterInfoWorkloadType
- autoscale GetClusterClusterInfoAutoscale
- autoterminationMinutes number
- Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes GetClusterClusterInfoAwsAttributes
- azureAttributes GetClusterClusterInfoAzureAttributes
- clusterCores number
- clusterId string
- The id of the cluster.
- clusterLogConf GetClusterClusterInfoClusterLogConf
- clusterLogStatus GetClusterClusterInfoClusterLogStatus
- clusterMemoryMb number
- clusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- clusterSource string
- creatorUserName string
- customTags {[key: string]: string}
- Additional tags for cluster resources.
- dataSecurityMode string
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- defaultTags {[key: string]: string}
- dockerImage GetClusterClusterInfoDockerImage
- driver GetClusterClusterInfoDriver
- driverInstancePoolId string
- similar to instance_pool_id, but for driver node.
- driverNodeTypeId string
- The node type of the Spark driver.
- enableElasticDisk boolean
- Use autoscaling local storage.
- enableLocalDiskEncryption boolean
- Enable local disk encryption.
- executors GetClusterClusterInfoExecutor[]
- gcpAttributes GetClusterClusterInfoGcpAttributes
- initScripts GetClusterClusterInfoInitScript[]
- instancePoolId string
- The pool of idle instances the cluster is attached to.
- isSingleNode boolean
- jdbcPort number
- kind string
- lastRestartedTime number
- lastStateLossTime number
- nodeTypeId string
- Any supported databricks.getNodeType id.
- numWorkers number
- policyId string
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine string
- The type of runtime of the cluster.
- singleUserName string
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf {[key: string]: string}
- Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId number
- sparkEnvVars {[key: string]: string}
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sparkVersion string
- Runtime version of the cluster.
- spec GetClusterClusterInfoSpec
- sshPublicKeys string[]
- SSH public key contents that will be added to each Spark node in this cluster.
- startTime number
- state string
- stateMessage string
- terminatedTime number
- terminationReason GetClusterClusterInfoTerminationReason
- useMlRuntime boolean
- workloadType GetClusterClusterInfoWorkloadType
- autoscale GetClusterClusterInfoAutoscale
- autotermination_minutes int
- Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- aws_attributes GetClusterClusterInfoAwsAttributes
- azure_attributes GetClusterClusterInfoAzureAttributes
- cluster_cores float
- cluster_id str
- The id of the cluster.
- cluster_log_conf GetClusterClusterInfoClusterLogConf
- cluster_log_status GetClusterClusterInfoClusterLogStatus
- cluster_memory_mb int
- cluster_name str
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- cluster_source str
- creator_user_name str
- custom_tags Mapping[str, str]
- Additional tags for cluster resources.
- data_security_mode str
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- default_tags Mapping[str, str]
- docker_image GetClusterClusterInfoDockerImage
- driver GetClusterClusterInfoDriver
- driver_instance_pool_id str
- similar to instance_pool_id, but for driver node.
- driver_node_type_id str
- The node type of the Spark driver.
- enable_elastic_disk bool
- Use autoscaling local storage.
- enable_local_disk_encryption bool
- Enable local disk encryption.
- executors Sequence[GetClusterClusterInfoExecutor]
- gcp_attributes GetClusterClusterInfoGcpAttributes
- init_scripts Sequence[GetClusterClusterInfoInitScript]
- instance_pool_id str
- The pool of idle instances the cluster is attached to.
- is_single_node bool
- jdbc_port int
- kind str
- last_restarted_time int
- last_state_loss_time int
- node_type_id str
- Any supported databricks.getNodeType id.
- num_workers int
- policy_id str
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtime_engine str
- The type of runtime of the cluster.
- single_user_name str
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- spark_conf Mapping[str, str]
- Map with key-value pairs to fine-tune Spark clusters.
- spark_context_id int
- spark_env_vars Mapping[str, str]
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- spark_version str
- Runtime version of the cluster.
- spec GetClusterClusterInfoSpec
- ssh_public_keys Sequence[str]
- SSH public key contents that will be added to each Spark node in this cluster.
- start_time int
- state str
- state_message str
- terminated_time int
- termination_reason GetClusterClusterInfoTerminationReason
- use_ml_runtime bool
- workload_type GetClusterClusterInfoWorkloadType
- autoscale Property Map
- autoterminationMinutes Number
- Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes Property Map
- azureAttributes Property Map
- clusterCores Number
- clusterId String
- The id of the cluster.
- clusterLogConf Property Map
- clusterLogStatus Property Map
- clusterMemoryMb Number
- clusterName String
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- clusterSource String
- creatorUserName String
- customTags Map<String>
- Additional tags for cluster resources.
- dataSecurityMode String
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- defaultTags Map<String>
- dockerImage Property Map
- driver Property Map
- driverInstancePoolId String
- similar to instance_pool_id, but for driver node.
- driverNodeTypeId String
- The node type of the Spark driver.
- enableElasticDisk Boolean
- Use autoscaling local storage.
- enableLocalDiskEncryption Boolean
- Enable local disk encryption.
- executors List<Property Map>
- gcpAttributes Property Map
- initScripts List<Property Map>
- instancePoolId String
- The pool of idle instances the cluster is attached to.
- isSingleNode Boolean
- jdbcPort Number
- kind String
- lastRestartedTime Number
- lastStateLossTime Number
- nodeTypeId String
- Any supported databricks.getNodeType id.
- numWorkers Number
- policyId String
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine String
- The type of runtime of the cluster.
- singleUserName String
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<String>
- Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId Number
- sparkEnvVars Map<String>
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sparkVersion String
- Runtime version of the cluster.
- spec Property Map
- sshPublicKeys List<String>
- SSH public key contents that will be added to each Spark node in this cluster.
- startTime Number
- state String
- stateMessage String
- terminatedTime Number
- terminationReason Property Map
- useMlRuntime Boolean
- workloadType Property Map
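The spark_env_vars semantics described above (each (X,Y) pair is exported as X='Y' when the driver and workers launch) can be illustrated with a small, hypothetical helper that is not part of the SDK:

```python
def render_env_exports(spark_env_vars: dict) -> list:
    """Render spark_env_vars pairs as shell export lines.

    Mimics how each (X, Y) pair is exported as X='Y' while launching
    the driver and workers; sorted for deterministic output.
    """
    return [f"export {key}='{value}'" for key, value in sorted(spark_env_vars.items())]
```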
GetClusterClusterInfoAutoscale    
- MaxWorkers int
- MinWorkers int
- MaxWorkers int
- MinWorkers int
- maxWorkers Integer
- minWorkers Integer
- maxWorkers number
- minWorkers number
- max_workers int
- min_workers int
- maxWorkers Number
- minWorkers Number
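The autotermination_minutes constraint documented in GetClusterClusterInfo above (0 disables automatic termination; any other value must be between 10 and 10000 minutes) can be checked with a small validator. This is an illustrative sketch, not an SDK function:

```python
def valid_autotermination(minutes: int) -> bool:
    """Validate an autotermination_minutes value.

    0 explicitly disables automatic termination; otherwise the
    threshold must fall within 10..10000 minutes, inclusive.
    """
    return minutes == 0 or 10 <= minutes <= 10000
```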
GetClusterClusterInfoAwsAttributes     
- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string
- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string
- availability String
- ebsVolumeCount Integer
- ebsVolumeIops Integer
- ebsVolumeSize Integer
- ebsVolumeThroughput Integer
- ebsVolumeType String
- firstOnDemand Integer
- instanceProfileArn String
- spotBidPricePercent Integer
- zoneId String
- availability string
- ebsVolumeCount number
- ebsVolumeIops number
- ebsVolumeSize number
- ebsVolumeThroughput number
- ebsVolumeType string
- firstOnDemand number
- instanceProfileArn string
- spotBidPricePercent number
- zoneId string
- availability str
- ebs_volume_count int
- ebs_volume_iops int
- ebs_volume_size int
- ebs_volume_throughput int
- ebs_volume_type str
- first_on_demand int
- instance_profile_arn str
- spot_bid_price_percent int
- zone_id str
- availability String
- ebsVolumeCount Number
- ebsVolumeIops Number
- ebsVolumeSize Number
- ebsVolumeThroughput Number
- ebsVolumeType String
- firstOnDemand Number
- instanceProfileArn String
- spotBidPricePercent Number
- zoneId String
GetClusterClusterInfoAzureAttributes     
- availability String
- firstOnDemand Number
- logAnalyticsInfo Property Map
- spotBidMaxPrice Number
GetClusterClusterInfoAzureAttributesLogAnalyticsInfo        
- LogAnalyticsPrimaryKey string
- LogAnalyticsWorkspaceId string
- LogAnalyticsPrimaryKey string
- LogAnalyticsWorkspaceId string
- logAnalyticsPrimaryKey String
- logAnalyticsWorkspaceId String
- logAnalyticsPrimaryKey string
- logAnalyticsWorkspaceId string
- logAnalyticsPrimaryKey String
- logAnalyticsWorkspaceId String
GetClusterClusterInfoClusterLogConf      
GetClusterClusterInfoClusterLogConfDbfs       
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoClusterLogConfS3       
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
- destination string
- cannedAcl string
- enableEncryption boolean
- encryptionType string
- endpoint string
- kmsKey string
- region string
- destination str
- canned_acl str
- enable_encryption bool
- encryption_type str
- endpoint str
- kms_key str
- region str
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
GetClusterClusterInfoClusterLogConfVolumes       
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoClusterLogStatus      
- LastAttempted int
- LastException string
- LastAttempted int
- LastException string
- lastAttempted Integer
- lastException String
- lastAttempted number
- lastException string
- last_attempted int
- last_exception str
- lastAttempted Number
- lastException String
GetClusterClusterInfoDockerImage     
- basicAuth Property Map
- url String
GetClusterClusterInfoDockerImageBasicAuth       
GetClusterClusterInfoDriver    
- HostPrivateIp string
- InstanceId string
- NodeAwsAttributes GetClusterClusterInfoDriverNodeAwsAttributes
- NodeId string
- PrivateIp string
- PublicDns string
- StartTimestamp int
- HostPrivateIp string
- InstanceId string
- NodeAwsAttributes GetClusterClusterInfoDriverNodeAwsAttributes
- NodeId string
- PrivateIp string
- PublicDns string
- StartTimestamp int
- hostPrivateIp String
- instanceId String
- nodeAwsAttributes GetClusterClusterInfoDriverNodeAwsAttributes
- nodeId String
- privateIp String
- publicDns String
- startTimestamp Integer
- hostPrivateIp string
- instanceId string
- nodeAwsAttributes GetClusterClusterInfoDriverNodeAwsAttributes
- nodeId string
- privateIp string
- publicDns string
- startTimestamp number
- hostPrivateIp String
- instanceId String
- nodeAwsAttributes Property Map
- nodeId String
- privateIp String
- publicDns String
- startTimestamp Number
GetClusterClusterInfoDriverNodeAwsAttributes       
- IsSpot bool
- IsSpot bool
- isSpot Boolean
- isSpot boolean
- is_spot bool
- isSpot Boolean
GetClusterClusterInfoExecutor    
- HostPrivateIp string
- InstanceId string
- NodeAwsAttributes GetClusterClusterInfoExecutorNodeAwsAttributes
- NodeId string
- PrivateIp string
- PublicDns string
- StartTimestamp int
- HostPrivateIp string
- InstanceId string
- NodeAwsAttributes GetClusterClusterInfoExecutorNodeAwsAttributes
- NodeId string
- PrivateIp string
- PublicDns string
- StartTimestamp int
- hostPrivateIp String
- instanceId String
- nodeAwsAttributes GetClusterClusterInfoExecutorNodeAwsAttributes
- nodeId String
- privateIp String
- publicDns String
- startTimestamp Integer
- hostPrivateIp string
- instanceId string
- nodeAwsAttributes GetClusterClusterInfoExecutorNodeAwsAttributes
- nodeId string
- privateIp string
- publicDns string
- startTimestamp number
- hostPrivateIp String
- instanceId String
- nodeAwsAttributes Property Map
- nodeId String
- privateIp String
- publicDns String
- startTimestamp Number
GetClusterClusterInfoExecutorNodeAwsAttributes       
- IsSpot bool
- IsSpot bool
- isSpot Boolean
- isSpot boolean
- is_spot bool
- isSpot Boolean
GetClusterClusterInfoGcpAttributes     
- Availability string
- BootDiskSize int
- GoogleServiceAccount string
- LocalSsdCount int
- UsePreemptibleExecutors bool
- ZoneId string
- Availability string
- BootDiskSize int
- GoogleServiceAccount string
- LocalSsdCount int
- UsePreemptibleExecutors bool
- ZoneId string
- availability String
- bootDiskSize Integer
- googleServiceAccount String
- localSsdCount Integer
- usePreemptibleExecutors Boolean
- zoneId String
- availability string
- bootDiskSize number
- googleServiceAccount string
- localSsdCount number
- usePreemptibleExecutors boolean
- zoneId string
- availability str
- boot_disk_size int
- google_service_account str
- local_ssd_count int
- use_preemptible_executors bool
- zone_id str
- availability String
- bootDiskSize Number
- googleServiceAccount String
- localSsdCount Number
- usePreemptibleExecutors Boolean
- zoneId String
GetClusterClusterInfoInitScript     
GetClusterClusterInfoInitScriptAbfss      
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoInitScriptDbfs      
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoInitScriptFile      
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoInitScriptGcs      
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoInitScriptS3      
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
- destination string
- cannedAcl string
- enableEncryption boolean
- encryptionType string
- endpoint string
- kmsKey string
- region string
- destination str
- canned_acl str
- enable_encryption bool
- encryption_type str
- endpoint str
- kms_key str
- region str
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
GetClusterClusterInfoInitScriptVolumes      
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoInitScriptWorkspace      
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpec
- ClusterId string
- The id of the cluster.
- DriverInstancePoolId string
- Similar to instance_pool_id, but for driver node.
- DriverNodeTypeId string
- The node type of the Spark driver.
- EnableElasticDisk bool
- Use autoscaling local storage.
- EnableLocalDiskEncryption bool
- Enable local disk encryption.
- NodeTypeId string
- Any supported databricks.getNodeType id.
- SparkVersion string
- Runtime version of the cluster.
- ApplyPolicyDefaultValues bool
- Autoscale GetClusterClusterInfoSpecAutoscale
- AwsAttributes GetClusterClusterInfoSpecAwsAttributes
- AzureAttributes GetClusterClusterInfoSpecAzureAttributes
- ClusterLogConf GetClusterClusterInfoSpecClusterLogConf
- ClusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
- ClusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- CustomTags Dictionary<string, string>
- Additional tags for cluster resources.
- DataSecurityMode string
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- DockerImage GetClusterClusterInfoSpecDockerImage
- GcpAttributes GetClusterClusterInfoSpecGcpAttributes
- IdempotencyToken string
- An optional token to guarantee the idempotency of cluster creation requests.
- InitScripts List<GetClusterClusterInfoSpecInitScript>
- InstancePoolId string
- The pool of idle instances the cluster is attached to.
- IsSingleNode bool
- Kind string
- Libraries List<GetClusterClusterInfoSpecLibrary>
- NumWorkers int
- PolicyId string
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RuntimeEngine string
- The type of runtime of the cluster.
- SingleUserName string
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf Dictionary<string, string>
- Map with key-value pairs to fine-tune Spark clusters.
- SparkEnvVars Dictionary<string, string>
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SshPublicKeys List<string>
- SSH public key contents that will be added to each Spark node in this cluster.
- UseMlRuntime bool
- WorkloadType GetClusterClusterInfoSpecWorkloadType
- ClusterId string
- The id of the cluster.
- DriverInstancePoolId string
- Similar to instance_pool_id, but for driver node.
- DriverNodeTypeId string
- The node type of the Spark driver.
- EnableElasticDisk bool
- Use autoscaling local storage.
- EnableLocalDiskEncryption bool
- Enable local disk encryption.
- NodeTypeId string
- Any supported databricks.getNodeType id.
- SparkVersion string
- Runtime version of the cluster.
- ApplyPolicyDefaultValues bool
- Autoscale GetClusterClusterInfoSpecAutoscale
- AwsAttributes GetClusterClusterInfoSpecAwsAttributes
- AzureAttributes GetClusterClusterInfoSpecAzureAttributes
- ClusterLogConf GetClusterClusterInfoSpecClusterLogConf
- ClusterMountInfos []GetClusterClusterInfoSpecClusterMountInfo
- ClusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- CustomTags map[string]string
- Additional tags for cluster resources.
- DataSecurityMode string
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- DockerImage GetClusterClusterInfoSpecDockerImage
- GcpAttributes GetClusterClusterInfoSpecGcpAttributes
- IdempotencyToken string
- An optional token to guarantee the idempotency of cluster creation requests.
- InitScripts []GetClusterClusterInfoSpecInitScript
- InstancePoolId string
- The pool of idle instances the cluster is attached to.
- IsSingleNode bool
- Kind string
- Libraries []GetClusterClusterInfoSpecLibrary
- NumWorkers int
- PolicyId string
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RuntimeEngine string
- The type of runtime of the cluster.
- SingleUserName string
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf map[string]string
- Map with key-value pairs to fine-tune Spark clusters.
- SparkEnvVars map[string]string
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SshPublicKeys []string
- SSH public key contents that will be added to each Spark node in this cluster.
- UseMlRuntime bool
- WorkloadType GetClusterClusterInfoSpecWorkloadType
- clusterId String
- The id of the cluster.
- driverInstancePoolId String
- Similar to instance_pool_id, but for driver node.
- driverNodeTypeId String
- The node type of the Spark driver.
- enableElasticDisk Boolean
- Use autoscaling local storage.
- enableLocalDiskEncryption Boolean
- Enable local disk encryption.
- nodeTypeId String
- Any supported databricks.getNodeType id.
- sparkVersion String
- Runtime version of the cluster.
- applyPolicyDefaultValues Boolean
- autoscale GetClusterClusterInfoSpecAutoscale
- awsAttributes GetClusterClusterInfoSpecAwsAttributes
- azureAttributes GetClusterClusterInfoSpecAzureAttributes
- clusterLogConf GetClusterClusterInfoSpecClusterLogConf
- clusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
- clusterName String
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- customTags Map<String,String>
- Additional tags for cluster resources.
- dataSecurityMode String
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- dockerImage GetClusterClusterInfoSpecDockerImage
- gcpAttributes GetClusterClusterInfoSpecGcpAttributes
- idempotencyToken String
- An optional token to guarantee the idempotency of cluster creation requests.
- initScripts List<GetClusterClusterInfoSpecInitScript>
- instancePoolId String
- The pool of idle instances the cluster is attached to.
- isSingleNode Boolean
- kind String
- libraries List<GetClusterClusterInfoSpecLibrary>
- numWorkers Integer
- policyId String
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine String
- The type of runtime of the cluster.
- singleUserName String
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<String,String>
- Map with key-value pairs to fine-tune Spark clusters.
- sparkEnvVars Map<String,String>
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys List<String>
- SSH public key contents that will be added to each Spark node in this cluster.
- useMlRuntime Boolean
- workloadType GetClusterClusterInfoSpecWorkloadType
- clusterId string
- The id of the cluster.
- driverInstancePoolId string
- Similar to instance_pool_id, but for driver node.
- driverNodeTypeId string
- The node type of the Spark driver.
- enableElasticDisk boolean
- Use autoscaling local storage.
- enableLocalDiskEncryption boolean
- Enable local disk encryption.
- nodeTypeId string
- Any supported databricks.getNodeType id.
- sparkVersion string
- Runtime version of the cluster.
- applyPolicyDefaultValues boolean
- autoscale GetClusterClusterInfoSpecAutoscale
- awsAttributes GetClusterClusterInfoSpecAwsAttributes
- azureAttributes GetClusterClusterInfoSpecAzureAttributes
- clusterLogConf GetClusterClusterInfoSpecClusterLogConf
- clusterMountInfos GetClusterClusterInfoSpecClusterMountInfo[]
- clusterName string
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- customTags {[key: string]: string}
- Additional tags for cluster resources.
- dataSecurityMode string
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- dockerImage GetClusterClusterInfoSpecDockerImage
- gcpAttributes GetClusterClusterInfoSpecGcpAttributes
- idempotencyToken string
- An optional token to guarantee the idempotency of cluster creation requests.
- initScripts GetClusterClusterInfoSpecInitScript[]
- instancePoolId string
- The pool of idle instances the cluster is attached to.
- isSingleNode boolean
- kind string
- libraries GetClusterClusterInfoSpecLibrary[]
- numWorkers number
- policyId string
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine string
- The type of runtime of the cluster.
- singleUserName string
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf {[key: string]: string}
- Map with key-value pairs to fine-tune Spark clusters.
- sparkEnvVars {[key: string]: string}
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys string[]
- SSH public key contents that will be added to each Spark node in this cluster.
- useMlRuntime boolean
- workloadType GetClusterClusterInfoSpecWorkloadType
- cluster_id str
- The id of the cluster.
- driver_instance_pool_id str
- Similar to instance_pool_id, but for driver node.
- driver_node_type_id str
- The node type of the Spark driver.
- enable_elastic_disk bool
- Use autoscaling local storage.
- enable_local_disk_encryption bool
- Enable local disk encryption.
- node_type_id str
- Any supported databricks.getNodeType id.
- spark_version str
- Runtime version of the cluster.
- apply_policy_default_values bool
- autoscale GetClusterClusterInfoSpecAutoscale
- aws_attributes GetClusterClusterInfoSpecAwsAttributes
- azure_attributes GetClusterClusterInfoSpecAzureAttributes
- cluster_log_conf GetClusterClusterInfoSpecClusterLogConf
- cluster_mount_infos Sequence[GetClusterClusterInfoSpecClusterMountInfo]
- cluster_name str
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- custom_tags Mapping[str, str]
- Additional tags for cluster resources.
- data_security_mode str
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- docker_image GetClusterClusterInfoSpecDockerImage
- gcp_attributes GetClusterClusterInfoSpecGcpAttributes
- idempotency_token str
- An optional token to guarantee the idempotency of cluster creation requests.
- init_scripts Sequence[GetClusterClusterInfoSpecInitScript]
- instance_pool_id str
- The pool of idle instances the cluster is attached to.
- is_single_node bool
- kind str
- libraries Sequence[GetClusterClusterInfoSpecLibrary]
- num_workers int
- policy_id str
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtime_engine str
- The type of runtime of the cluster.
- single_user_name str
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- spark_conf Mapping[str, str]
- Map with key-value pairs to fine-tune Spark clusters.
- spark_env_vars Mapping[str, str]
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- ssh_public_keys Sequence[str]
- SSH public key contents that will be added to each Spark node in this cluster.
- use_ml_runtime bool
- workload_type GetClusterClusterInfoSpecWorkloadType
- clusterId String
- The id of the cluster.
- driverInstancePoolId String
- Similar to instance_pool_id, but for driver node.
- driverNodeTypeId String
- The node type of the Spark driver.
- enableElasticDisk Boolean
- Use autoscaling local storage.
- enableLocalDiskEncryption Boolean
- Enable local disk encryption.
- nodeTypeId String
- Any supported databricks.getNodeType id.
- sparkVersion String
- Runtime version of the cluster.
- applyPolicyDefaultValues Boolean
- autoscale Property Map
- awsAttributes Property Map
- azureAttributes Property Map
- clusterLogConf Property Map
- clusterMountInfos List<Property Map>
- clusterName String
- The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- customTags Map<String>
- Additional tags for cluster resources.
- dataSecurityMode String
- Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Default to NONE, i.e. no security feature enabled.
- dockerImage Property Map
- gcpAttributes Property Map
- idempotencyToken String
- An optional token to guarantee the idempotency of cluster creation requests.
- initScripts List<Property Map>
- instancePoolId String
- The pool of idle instances the cluster is attached to.
- isSingleNode Boolean
- kind String
- libraries List<Property Map>
- numWorkers Number
- policyId String
- Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine String
- The type of runtime of the cluster.
- singleUserName String
- The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<String>
- Map with key-value pairs to fine-tune Spark clusters.
- sparkEnvVars Map<String>
- Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys List<String>
- SSH public key contents that will be added to each Spark node in this cluster.
- useMlRuntime Boolean
- workloadType Property Map
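The spec above notes that spark_env_vars key-value pairs of the form (X,Y) are exported as X='Y' when the driver and workers launch. A minimal plain-Python sketch of that expansion (render_env_exports is an illustrative helper, not part of the provider):

```python
def render_env_exports(spark_env_vars: dict) -> list:
    """Render (X, Y) pairs as the X='Y' export lines described above."""
    return ["{}='{}'".format(key, value) for key, value in sorted(spark_env_vars.items())]

# Example: two environment variables as they would be exported at launch.
lines = render_env_exports({"MLFLOW_TRACKING_URI": "databricks", "JAVA_OPTS": "-Xmx4g"})
```

This is only a model of the documented behavior; the actual export happens inside the cluster launch scripts, not in your Pulumi program.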
GetClusterClusterInfoSpecAutoscale     
- MaxWorkers int
- MinWorkers int
- MaxWorkers int
- MinWorkers int
- maxWorkers Integer
- minWorkers Integer
- maxWorkers number
- minWorkers number
- max_workers int
- min_workers int
- maxWorkers Number
- minWorkers Number
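An autoscale block only carries the minWorkers/maxWorkers bounds; the effective worker count always stays inside that range. A small illustrative sketch of the invariant (clamp_workers is a hypothetical helper, not provider API):

```python
def clamp_workers(desired: int, min_workers: int, max_workers: int) -> int:
    """Keep a requested worker count within the autoscale bounds."""
    return max(min_workers, min(desired, max_workers))

# With autoscale bounds 2..8, a request for 12 workers is capped at 8.
capped = clamp_workers(12, min_workers=2, max_workers=8)
```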
GetClusterClusterInfoSpecAwsAttributes      
- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string
- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string
- availability String
- ebsVolumeCount Integer
- ebsVolumeIops Integer
- ebsVolumeSize Integer
- ebsVolumeThroughput Integer
- ebsVolumeType String
- firstOnDemand Integer
- instanceProfileArn String
- spotBidPricePercent Integer
- zoneId String
- availability string
- ebsVolumeCount number
- ebsVolumeIops number
- ebsVolumeSize number
- ebsVolumeThroughput number
- ebsVolumeType string
- firstOnDemand number
- instanceProfileArn string
- spotBidPricePercent number
- zoneId string
- availability str
- ebs_volume_count int
- ebs_volume_iops int
- ebs_volume_size int
- ebs_volume_throughput int
- ebs_volume_type str
- first_on_demand int
- instance_profile_arn str
- spot_bid_price_percent int
- zone_id str
- availability String
- ebsVolumeCount Number
- ebsVolumeIops Number
- ebsVolumeSize Number
- ebsVolumeThroughput Number
- ebsVolumeType String
- firstOnDemand Number
- instanceProfileArn String
- spotBidPricePercent Number
- zoneId String
GetClusterClusterInfoSpecAzureAttributes      
- availability String
- firstOnDemand Number
- logAnalyticsInfo Property Map
- spotBidMaxPrice Number
GetClusterClusterInfoSpecAzureAttributesLogAnalyticsInfo         
- LogAnalyticsPrimaryKey string
- LogAnalyticsWorkspaceId string
- LogAnalyticsPrimaryKey string
- LogAnalyticsWorkspaceId string
- logAnalyticsPrimaryKey String
- logAnalyticsWorkspaceId String
- logAnalyticsPrimaryKey string
- logAnalyticsWorkspaceId string
- logAnalyticsPrimaryKey String
- logAnalyticsWorkspaceId String
GetClusterClusterInfoSpecClusterLogConf       
GetClusterClusterInfoSpecClusterLogConfDbfs        
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecClusterLogConfS3        
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
- destination string
- cannedAcl string
- enableEncryption boolean
- encryptionType string
- endpoint string
- kmsKey string
- region string
- destination str
- canned_acl str
- enable_encryption bool
- encryption_type str
- endpoint str
- kms_key str
- region str
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
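For an s3 cluster-log destination, the destination field is required, and the Databricks provider documentation notes that either region or endpoint should be set alongside it. A hedged sketch of that check (validate_s3_log_conf is an illustrative helper, not provider code, and the rule itself should be confirmed against the provider docs):

```python
def validate_s3_log_conf(conf: dict) -> dict:
    """Check an s3 log-conf mapping: destination plus region or endpoint."""
    if not conf.get("destination"):
        raise ValueError("s3 log conf requires a destination, e.g. s3://bucket/prefix")
    if not conf.get("region") and not conf.get("endpoint"):
        raise ValueError("set either region or endpoint for the s3 destination")
    return conf

conf = validate_s3_log_conf({"destination": "s3://logs/cluster", "region": "us-east-1"})
```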
GetClusterClusterInfoSpecClusterLogConfVolumes        
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecClusterMountInfo       
GetClusterClusterInfoSpecClusterMountInfoNetworkFilesystemInfo          
- ServerAddress string
- MountOptions string
- ServerAddress string
- MountOptions string
- serverAddress String
- mountOptions String
- serverAddress string
- mountOptions string
- server_address str
- mount_options str
- serverAddress String
- mountOptions String
GetClusterClusterInfoSpecDockerImage      
- url String
- basicAuth Property Map
GetClusterClusterInfoSpecDockerImageBasicAuth        
GetClusterClusterInfoSpecGcpAttributes      
- Availability string
- BootDisk intSize 
- GoogleService stringAccount 
- LocalSsd intCount 
- UsePreemptible boolExecutors 
- ZoneId string
- Availability string
- BootDisk intSize 
- GoogleService stringAccount 
- LocalSsd intCount 
- UsePreemptible boolExecutors 
- ZoneId string
- availability String
- bootDisk IntegerSize 
- googleService StringAccount 
- localSsd IntegerCount 
- usePreemptible BooleanExecutors 
- zoneId String
- availability string
- bootDisk numberSize 
- googleService stringAccount 
- localSsd numberCount 
- usePreemptible booleanExecutors 
- zoneId string
- availability str
- boot_disk_ intsize 
- google_service_ straccount 
- local_ssd_ intcount 
- use_preemptible_ boolexecutors 
- zone_id str
- availability String
- bootDisk NumberSize 
- googleService StringAccount 
- localSsd NumberCount 
- usePreemptible BooleanExecutors 
- zoneId String
GetClusterClusterInfoSpecInitScript      
- Abfss GetClusterClusterInfoSpecInitScriptAbfss
- Dbfs GetClusterClusterInfoSpecInitScriptDbfs
- File GetClusterClusterInfoSpecInitScriptFile
- Gcs GetClusterClusterInfoSpecInitScriptGcs
- S3 GetClusterClusterInfoSpecInitScriptS3
- Volumes GetClusterClusterInfoSpecInitScriptVolumes
- Workspace GetClusterClusterInfoSpecInitScriptWorkspace
- Abfss GetClusterClusterInfoSpecInitScriptAbfss
- Dbfs GetClusterClusterInfoSpecInitScriptDbfs
- File GetClusterClusterInfoSpecInitScriptFile
- Gcs GetClusterClusterInfoSpecInitScriptGcs
- S3 GetClusterClusterInfoSpecInitScriptS3
- Volumes GetClusterClusterInfoSpecInitScriptVolumes
- Workspace GetClusterClusterInfoSpecInitScriptWorkspace
- abfss GetClusterClusterInfoSpecInitScriptAbfss
- dbfs GetClusterClusterInfoSpecInitScriptDbfs
- file GetClusterClusterInfoSpecInitScriptFile
- gcs GetClusterClusterInfoSpecInitScriptGcs
- s3 GetClusterClusterInfoSpecInitScriptS3
- volumes GetClusterClusterInfoSpecInitScriptVolumes
- workspace GetClusterClusterInfoSpecInitScriptWorkspace
- abfss GetClusterClusterInfoSpecInitScriptAbfss
- dbfs GetClusterClusterInfoSpecInitScriptDbfs
- file GetClusterClusterInfoSpecInitScriptFile
- gcs GetClusterClusterInfoSpecInitScriptGcs
- s3 GetClusterClusterInfoSpecInitScriptS3
- volumes GetClusterClusterInfoSpecInitScriptVolumes
- workspace GetClusterClusterInfoSpecInitScriptWorkspace
- abfss GetClusterClusterInfoSpecInitScriptAbfss
- dbfs GetClusterClusterInfoSpecInitScriptDbfs
- file GetClusterClusterInfoSpecInitScriptFile
- gcs GetClusterClusterInfoSpecInitScriptGcs
- s3 GetClusterClusterInfoSpecInitScriptS3
- volumes GetClusterClusterInfoSpecInitScriptVolumes
- workspace GetClusterClusterInfoSpecInitScriptWorkspace
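Each init-script entry populates one of these destination blocks (abfss, dbfs, file, gcs, s3, volumes, workspace). A hedged sketch of picking out the populated source from a returned entry, treated as a plain mapping (init_script_source is a hypothetical helper, not provider API):

```python
INIT_SCRIPT_SOURCES = ("abfss", "dbfs", "file", "gcs", "s3", "volumes", "workspace")

def init_script_source(entry: dict) -> tuple:
    """Return (source_kind, block) for the single populated destination block."""
    populated = [(k, entry[k]) for k in INIT_SCRIPT_SOURCES if entry.get(k)]
    if len(populated) != 1:
        raise ValueError("expected exactly one init-script destination block")
    return populated[0]

kind, block = init_script_source({"workspace": {"destination": "/Shared/init.sh"}})
```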
GetClusterClusterInfoSpecInitScriptAbfss       
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptDbfs       
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptFile       
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptGcs       
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptS3       
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
- destination string
- cannedAcl string
- enableEncryption boolean
- encryptionType string
- endpoint string
- kmsKey string
- region string
- destination str
- canned_acl str
- enable_encryption bool
- encryption_type str
- endpoint str
- kms_key str
- region str
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
GetClusterClusterInfoSpecInitScriptVolumes       
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptWorkspace       
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecLibrary     
- cran Property Map
- egg String
- jar String
- maven Property Map
- pypi Property Map
- requirements String
- whl String
GetClusterClusterInfoSpecLibraryCran      
GetClusterClusterInfoSpecLibraryMaven      
- Coordinates string
- Exclusions List<string>
- Repo string
- Coordinates string
- Exclusions []string
- Repo string
- coordinates String
- exclusions List<String>
- repo String
- coordinates string
- exclusions string[]
- repo string
- coordinates str
- exclusions Sequence[str]
- repo str
- coordinates String
- exclusions List<String>
- repo String
GetClusterClusterInfoSpecLibraryPypi      
GetClusterClusterInfoSpecWorkloadType      
GetClusterClusterInfoSpecWorkloadTypeClients       
GetClusterClusterInfoTerminationReason     
- Code string
- Parameters Dictionary<string, string>
- Type string
- Code string
- Parameters map[string]string
- Type string
- code String
- parameters Map<String,String>
- type String
- code string
- parameters {[key: string]: string}
- type string
- code str
- parameters Mapping[str, str]
- type str
- code String
- parameters Map<String>
- type String
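The termination reason combines a code, a type, and a parameters map; for logging or display these are often flattened into one line. An illustrative formatter over those three fields (not part of the provider):

```python
def format_termination_reason(code: str, type_: str, parameters: dict) -> str:
    """Flatten a termination reason into a single human-readable line."""
    details = ", ".join("{}={}".format(k, v) for k, v in sorted(parameters.items()))
    return "{}:{}".format(type_, code) + (" ({})".format(details) if details else "")

msg = format_termination_reason("INACTIVITY", "SUCCESS", {"inactivity_duration_min": "120"})
```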
GetClusterClusterInfoWorkloadType     
GetClusterClusterInfoWorkloadTypeClients      
Package Details
- Repository
- databricks pulumi/pulumi-databricks
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the databricks Terraform Provider.