403 Forbidden when trying to create a Delta table in OneLake with a notebook using SQL

Siva Reddy 40 Reputation points
2024-08-13T06:46:10.3733333+00:00

I created a Spark dataframe from the CSV files and also created a temporary view, TableTempView, on top of this dataframe.
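Roughly like this, for context (a minimal sketch of the setup; the Files/sales/*.csv path and the read options are placeholders, not my exact code):

%%pyspark
# Placeholder setup: read CSVs from the lakehouse Files area into a dataframe
# and register a temporary view over it. The path is illustrative only.
df = spark.read.format("csv") \
    .option("header", "true") \
    .option("inferSchema", "true") \
    .load("Files/sales/*.csv")

df.createOrReplaceTempView("TableTempView")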
When I try to create a table in the lakehouse using the code below:

%%sql
CREATE TABLE IF NOT EXISTS Bronze_Sales21
USING DELTA
AS
SELECT * FROM TableTempView

I'm seeing the error below:
Request failed: HTTP/1.1 403 Forbidden
com.microsoft.fabric.spark.metadata.Helpers$.executeRequest(Helpers.scala:154)
com.microsoft.fabric.platform.PbiPlatformClient.newGetRequest(PbiPlatformClient.scala:51)
com.microsoft.fabric.platform.PbiPlatformInternalApiClient.getAllWorkspaces(PbiPlatformClient.scala:199)
com.microsoft.fabric.platform.PbiPlatformCachingClient.getWorkspace(PbiPlatformClient.scala:146)
com.microsoft.fabric.platform.PbiPlatformCachingClient.getArtifacts(PbiPlatformClient.scala:136)
com.microsoft.fabric.spark.metadata.SchemaPathResolver.getSchemaRoot(pathResolvers.scala:144)
com.microsoft.fabric.spark.metadata.DefaultSchemaMetadataManager.listSchemas(DefaultSchemaMetadataManager.scala:218)
...
com.microsoft.fabric.spark.metadata.DefaultSchemaMetadataManager.getSchema(DefaultSchemaMetadataManager.scala:73)
com.microsoft.fabric.spark.catalog.OnelakeExternalCatalog.getDatabase(OnelakeExternalCatalog.scala:78)
com.microsoft.fabric.spark.catalog.OnelakeExternalCatalog.databaseExists(OnelakeExternalCatalog.scala:84)
org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:142)
org.apache.spark.sql.catalyst.catalog.SessionCatalog.databaseExists(SessionCatalog.scala:363)
org.apache.spark.sql.catalyst.catalog.SessionCatalog.getTableMetadata(SessionCatalog.scala:606)
org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.loadTable(V2SessionCatalog.scala:80)
org.apache.spark.sql.delta.catalog.DeltaCatalog.loadTable(DeltaCatalog.scala:174)
...
org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:210)
org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:110)
org.apache.spark.sql.SparkSession.sql(SparkSession.scala:671)
org.apache.livy.repl.SQLInterpreter.execute(SQLInterpreter.scala:163)
org.apache.livy.repl.Session.executeCode(Session.scala:865)
...
java.base/java.lang.Thread.run(Thread.java:829)

7 answers

  1. Siva Reddy 40 Reputation points
    2024-08-14T07:11:10.3333333+00:00

    Hi @Bhargava-MSFT
    Thank you for the reply. I can confirm that I have admin access, but I still can't create a Delta table in the lakehouse Tables section from the notebook.

    1 person found this answer helpful.

  2. Bhargava-MSFT 31,116 Reputation points Microsoft Employee
    2024-08-13T22:22:43.47+00:00

    Hello Siva Reddy,

    A 403 Forbidden error generally occurs when the user or service principal used to authenticate the request does not have the necessary permissions to create tables in the lakehouse.

    Please make sure that the identity has those permissions by reviewing the access control settings for the lakehouse (for a service principal, grant the Storage Blob Data Contributor role on your data lake).

    For testing purposes, you can also try a different authentication method that has the required permissions.
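    For instance, you could try a direct write from PySpark in the same notebook to isolate where the failure happens (a minimal sketch, assuming a default lakehouse is attached to the notebook; the target folder name is just an example):

    %%pyspark
    # Hypothetical smoke test: write the same data straight to the Tables area
    # as Delta files. This path-based write skips the workspace metadata lookup
    # that raised the 403 above, separating storage access from catalog access.
    df = spark.table("TableTempView")
    df.write.format("delta").mode("overwrite").save("Tables/Bronze_Sales21_test")

    If this path-based write succeeds while the %%sql CREATE TABLE still fails, the block is in the workspace metadata call rather than in OneLake storage access.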

    I hope this helps.


  3. Abdourahmane Bah 10 Reputation points
    2024-08-26T17:58:19.24+00:00

    I'm encountering the same issue while trying to create a table in my lakehouse using a notebook, even though I am the owner of the lakehouse.

    Have you found a solution?


  4. Madhur Sawant 0 Reputation points
    2024-08-30T15:31:20.1366667+00:00

    Same here. I'm trying to work through the training at https://microsoftlearning.github.io/mslearn-fabric/Instructions/Labs/03-delta-lake.html and cannot write Delta tables. A Fabric training account should have come with all the relevant permissions granted.


  5. Deleted

    This answer has been deleted due to a violation of our Code of Conduct. The answer was manually reported or identified through automated detection before action was taken. Please refer to our Code of Conduct for more information.


    Comments have been turned off.
