
Notebooks in Microsoft Fabric

If you’re new to Microsoft Fabric and feeling a bit overwhelmed by all the tools at your disposal, you’re in the right place.

If you’re not quite sure what Microsoft Fabric is yet, I highly recommend checking out my introductory series on Microsoft Fabric before diving in.

Notebooks in Fabric are like your personal playground for coding, data wrangling, and even building machine learning models. They’re built on Apache Spark, making them perfect for data engineers and scientists alike. In this guide, we’ll walk through the basics of using notebooks, from creation to advanced features, sprinkled with handy tips and tricks to make your life easier. We’ll draw from Microsoft’s official documentation to keep things accurate and up-to-date.

Whether you’re ingesting data, transforming it, or experimenting with ML, notebooks offer an interactive, web-based environment that’s collaborative and powerful. Let’s dive in!

What Are Notebooks in Microsoft Fabric?

At their core, notebooks are interactive documents where you can mix executable code, visualizations, and explanatory text. Think of them as a blend of a code editor, a report builder, and a collaboration tool—all powered by Apache Spark for handling big data.

  • For Data Engineers: Use them to ingest, prepare, and transform data seamlessly.
  • For Data Scientists: Experiment with machine learning models, track progress, and deploy solutions.
  • Key Perks: Real-time visualizations, Markdown for documentation, and tight integration with Fabric’s ecosystem like lakehouses and pipelines.

Tip: If you’re coming from Jupyter Notebooks, you’ll feel right at home—Fabric supports importing .ipynb files directly!

Getting Started: Creating Your First Notebook

Starting is simple—no need for complex setups.

  1. Head to the Data Engineering homepage in Fabric.
  2. Click New in your workspace or use the Create Hub.
  3. Select Notebook, give it a name, and boom—you’re in!

You can also import existing notebooks:

  • From your local machine: Use the workspace toolbar to upload .ipynb, .py, .scala, or .sql files. Fabric converts them automatically.

Trick: Always start with a blank notebook for practice. Name it something descriptive like “MyFirstDataTransform” to keep your workspace organized.

Editing and Saving: The Basics

Once created, your notebook opens in Develop mode (if you have edit permissions). Here’s the lowdown:

  • Autosave is On by Default: Edits save automatically after you start working. No more losing progress!
  • Switch to Manual Save: Go to Edit > Save options > Manual if you prefer control. Then use Ctrl+S or the Save button.
  • Save a Copy: Clone your notebook to experiment without messing up the original—great for testing variations.

Tip: In a team setting, toggle to Run Only or View mode to avoid accidental changes when reviewing someone else’s work.

Trick: Use Save a Copy to create branches for different experiments, like one for data cleaning and another for visualization tweaks.

Working with Cells: Code and Markdown Magic

Notebooks are made of cells—building blocks for your content.

  • Code Cells: Write and run code in languages like Python, Scala, or SQL. Right-click files in the lakehouse explorer to auto-generate code snippets (e.g., loading a CSV with Spark or Pandas).
  • Markdown Cells: Add text, headings, lists, or even images for explanations. Perfect for documenting your thought process.
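For instance, a code cell that loads a CSV with pandas might look like the sketch below. The file name and contents are made up here; in a real notebook you'd point at a lakehouse file or a generated snippet instead.

```python
import pandas as pd

# In Fabric you'd usually read from a lakehouse path; this example creates
# a tiny CSV locally so it is self-contained and runnable anywhere.
csv_path = "sales_sample.csv"  # hypothetical file name
with open(csv_path, "w") as f:
    f.write("region,amount\nWest,100\nEast,250\n")

df = pd.read_csv(csv_path)
print(df.head())   # inspect the first rows
print(len(df))     # number of rows loaded
```

The auto-generated snippets from the lakehouse explorer follow the same pattern, just with the real file path filled in.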

To run a cell: Hit the play button or use shortcuts (more on those later).

Tip: Start every notebook with a Markdown cell outlining your goals—it keeps you focused and helps collaborators understand your flow.

Trick: Use magic commands (like %%sql for SQL queries or %%pyspark for PySpark code) to switch contexts quickly without restarting sessions. This is a game-changer for mixing languages in one notebook!
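As a sketch, a SQL cell followed by a PySpark cell might look like this (the `sales` table is hypothetical; each magic goes at the top of its own cell):

```
%%sql
-- Cell 1: query a lakehouse table directly in SQL
SELECT region, SUM(amount) AS total
FROM sales
GROUP BY region
```

```
%%pyspark
# Cell 2: back to PySpark in the same session, no restart needed
df = spark.sql("SELECT * FROM sales")
df.show(5)
```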

Integrating with Lakehouses and Managing Files

Fabric shines in data integration—notebooks connect seamlessly to lakehouses for file and table access.

  • Add a Lakehouse: From the Lakehouse explorer, attach an existing one or create a new one. Pin it as the default lakehouse so you can use short relative paths (e.g., read files as if they were local).
  • Browse and Operate: In the Lake view, explore tables and files. Right-click to copy paths or generate load code.
  • Resource Folders:
    • Built-in Resources: Per-notebook storage for small files (up to 500 MB total). Upload, download, or access via relative paths.
    • Environment Resources: Shared across notebooks in the same environment—ideal for common scripts.
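As a sketch of what "read files like they're local" means: once a default lakehouse is pinned, a code cell can load a file with a short relative path. This only runs inside a Fabric notebook session (where the `spark` session and `display` helper are provided), and the path below is illustrative:

```
# Runs inside a Fabric notebook with a default lakehouse attached.
# "Files/raw/orders.csv" is a hypothetical path under that lakehouse.
df = (spark.read.format("csv")
      .option("header", "true")
      .load("Files/raw/orders.csv"))
display(df)
```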

Need to edit a file? Use the built-in File Editor for CSV, TXT, PY, etc. (up to 1 MB). Save with Ctrl+S.

Tip: After pinning or renaming a lakehouse, restart your Spark session to avoid path errors.

Trick: Drag and drop files into the resources folder for quick uploads. In code, use notebookutils.nbResPath to grab the absolute resources path dynamically; it saves time debugging!
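A minimal sketch of reading a built-in resource file (Fabric-only; `data.csv` is a hypothetical file you've uploaded to the notebook's resources):

```
# notebookutils is pre-loaded in Fabric notebook sessions; no import needed there.
import pandas as pd

# Relative path into the built-in resources folder
df = pd.read_csv("builtin/data.csv")

# Absolute root of the built-in resources folder, for when a library needs a full path
print(notebookutils.nbResPath)
```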

Running Code: Sessions, Security, and Best Practices

Running code is interactive and secure:

  • Interactive Runs: Manual execution under your user context.
  • In Pipelines or Schedules: Runs under the pipeline editor’s or schedule creator’s identity—double-check permissions!

First-time users get a warning: Review code before running to avoid surprises.

Tip: For big jobs, monitor Spark sessions in the UI to spot bottlenecks early.

Trick: Use workspace stages (dev/test/prod) to test notebooks safely without risking production data. Always review version history before executing shared code.

Keyboard Shortcuts to Boost Productivity

Who doesn’t love shortcuts? Here are essentials:

  • Ctrl+S: Save (in manual mode).
  • In the file editor: Standard code navigation and editing keys work, with syntax highlighting.

Tip: Learn cell-specific shortcuts like Shift+Enter to run and move to the next cell—speeds up iterative testing.

Trick: Customize your workflow by combining shortcuts with magic commands for ultra-efficient debugging.

Collaboration: Team Up in Real Time

Notebooks aren’t solo affairs:

  • Co-editing: Multiple users edit simultaneously—see cursors, selections, and live changes.
  • Sharing: Grant Edit, Run, or Share permissions via the toolbar.
  • Comments: Add threaded discussions on cells. Tag @users for notifications (emails sent if needed).

Tip: Use comments for feedback loops in team projects—it beats endless email threads.

Trick: For pair programming, share in Develop mode and use real-time visibility to debug together remotely.

Version History: Track Changes Like a Pro

Version history is still in preview, but it's already super useful:

  • Checkpoints: Auto every 5 minutes, or manual for milestones.
  • Diff View: Compare versions to see changes in code, output, and metadata.
  • Restore or Copy: Roll back or branch from old versions.

Integrates with Git, VS Code, and pipelines for multi-source tracking.

Tip: Create manual checkpoints before major experiments—easy rollback if things go sideways.

Trick: Label versions descriptively (e.g., “Added ML Model v1”) to make history navigation a breeze.

Troubleshooting Common Hiccups

  • Session Issues: Restart after lakehouse changes.
  • File Limits: Stick to 100 MB per file in resources; use lakehouses for bigger stuff.
  • Permissions: Ensure collaborators have access to tagged resources.
  • No Autosave in Editor: Always Ctrl+S when editing files.

Best Practice: Verify the “last modified by” user in pipelines to maintain security.

Top Tips and Tricks for New Learners

Here’s a roundup to accelerate your learning:

  • Start Small: Begin with simple data loads from a lakehouse to build confidence.
  • Visualize Early: Use libraries like Matplotlib in code cells for quick charts—Fabric handles rich outputs beautifully.
  • Experiment with Modes: Switch between Develop and Run Only to test execution without edits.
  • Leverage Integrations: Mount lakehouses as defaults to simplify paths; it’s a huge time-saver.
  • Security First: Always scan shared notebooks via version history.
  • Resource Optimization: Use shared environment folders for reusable code modules across projects.
  • Pro Debugging: Tag comments during co-edits for targeted fixes.
  • Bonus: If stuck, check Fabric’s troubleshooting sections or community forums for real-world advice.
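To make the "visualize early" tip concrete, here is a minimal chart you could drop into a code cell; the monthly figures are made up. In a Fabric notebook the plot renders inline automatically, so the explicit backend and savefig call are only needed when running outside a notebook:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for running outside a notebook
import matplotlib.pyplot as plt

# Hypothetical monthly sales figures
months = ["Jan", "Feb", "Mar", "Apr"]
sales = [120, 150, 90, 180]

fig, ax = plt.subplots()
ax.bar(months, sales)
ax.set_title("Monthly Sales (sample data)")
ax.set_ylabel("Units")
fig.savefig("monthly_sales.png")
```

A quick bar chart like this often reveals data problems (gaps, outliers, wrong units) long before a formal report would.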

Wrapping Up

Congratulations—you’re now equipped to tackle notebooks in Microsoft Fabric like a seasoned pro! They’re not just tools; they’re your gateway to efficient, collaborative data work. Practice with a sample dataset, experiment freely, and soon you’ll be building pipelines and ML models effortlessly.

Stay tuned for more advanced guides and real-world scenarios on Microsoft Fabric. Subscribe to the newsletter and keep exploring the world of data. 🚀
