JavaScript Generator Protocol Extension: Mastering the Enhanced Iterator Interface
Explore how JavaScript's generator protocol extensions empower developers to create sophisticated, highly efficient, and composable iteration patterns. This comprehensive guide covers `yield*`, generator `return` values, sending values with `next()`, and advanced error handling and termination methods.
In the dynamic world of JavaScript, efficient data processing and control flow management are paramount. Modern applications constantly deal with streams of data, asynchronous operations, and complex sequences, demanding robust and elegant solutions. This comprehensive guide delves into the fascinating realm of JavaScript Generators, specifically focusing on their protocol extensions that elevate the humble iterator to a powerful, versatile tool. We'll explore how these enhancements empower developers to craft highly efficient, composable, and readable code for a myriad of complex scenarios, from data pipelines to asynchronous workflows.
Before we embark on this journey into advanced generator capabilities, let's briefly revisit the foundational concepts of iterators and iterables in JavaScript. Understanding these core building blocks is crucial for appreciating the sophistication that generators bring to the table.
The Foundations: Iterables and Iterators in JavaScript
At its heart, the concept of iteration in JavaScript revolves around two fundamental protocols:
- The Iterable Protocol: Defines how an object can be iterated over using a `for...of` loop. An object is iterable if it has a method named `[Symbol.iterator]` that returns an iterator.
- The Iterator Protocol: Defines how an object produces a sequence of values. An object is an iterator if it has a `next()` method that returns an object with two properties: `value` (the next item in the sequence) and `done` (a boolean indicating if the sequence has finished).
Understanding the Iterable Protocol (Symbol.iterator)
Any object that possesses a method accessible via the [Symbol.iterator] key is considered an iterable. This method, when called, must return an iterator. Built-in types like Arrays, Strings, Maps, and Sets are all naturally iterable.
Consider a simple array:
const myArray = [1, 2, 3];
const iterator = myArray[Symbol.iterator]();
console.log(iterator.next()); // { value: 1, done: false }
console.log(iterator.next()); // { value: 2, done: false }
console.log(iterator.next()); // { value: 3, done: false }
console.log(iterator.next()); // { value: undefined, done: true }
The for...of loop internally utilizes this protocol to iterate over values. It automatically calls [Symbol.iterator]() once to get the iterator, and then repeatedly calls next() until done becomes true.
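Conceptually, the loop desugars to something like the following sketch (the variable names are illustrative):
const letters = ["a", "b", "c"];
// Roughly what `for (const item of letters) { ... }` does internally:
const it = letters[Symbol.iterator]();
let step = it.next();
while (!step.done) {
  const item = step.value;
  console.log(item); // a, b, c
  step = it.next();
}
The real loop does a little more (for example, it calls the iterator's return() method on early exit, a detail we revisit later), but this captures the core mechanics.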
Understanding the Iterator Protocol (next(), value, done)
An object adhering to the Iterator Protocol provides a next() method. Each call to next() returns an object with two key properties:
- `value`: The actual data item from the sequence. This can be any JavaScript value.
- `done`: A boolean flag. `false` indicates there are more values to produce; `true` indicates the iteration is complete, and `value` will often be `undefined` (though it can technically be any final result).
Manually implementing an iterator can be verbose:
function createRangeIterator(start, end) {
let current = start;
return {
next() {
if (current <= end) {
return { value: current++, done: false };
} else {
return { value: undefined, done: true };
}
}
};
}
const range = createRangeIterator(1, 3);
console.log(range.next()); // { value: 1, done: false }
console.log(range.next()); // { value: 2, done: false }
console.log(range.next()); // { value: 3, done: false }
console.log(range.next()); // { value: undefined, done: true }
Generators: Simplifying Iterator Creation
This is where generators shine. Introduced in ECMAScript 2015 (ES6), generator functions (declared with function*) provide a much more ergonomic way to write iterators. When a generator function is called, it doesn't execute its body immediately; instead, it returns a Generator Object. This object itself conforms to both the Iterable and Iterator Protocols.
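A small detail worth verifying for yourself: a generator object's [Symbol.iterator]() method returns the generator itself, which is how a single object satisfies both protocols at once.
function* demo() {
  yield 1;
}
const gen = demo();
// The generator object is its own iterator
console.log(gen[Symbol.iterator]() === gen); // true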
The magic happens with the yield keyword. When yield is encountered, the generator pauses execution, returns the yielded value, and saves its state. When next() is called again on the generator object, execution resumes from where it left off, continuing until the next yield or the function body completes.
A Simple Generator Example
Let's rewrite our createRangeIterator using a generator:
function* rangeGenerator(start, end) {
for (let i = start; i <= end; i++) {
yield i;
}
}
const myRange = rangeGenerator(1, 3);
console.log(myRange.next()); // { value: 1, done: false }
console.log(myRange.next()); // { value: 2, done: false }
console.log(myRange.next()); // { value: 3, done: false }
console.log(myRange.next()); // { value: undefined, done: true }
// Generators are also iterable, so you can use for...of directly:
console.log("Using for...of:");
for (const num of rangeGenerator(4, 6)) {
console.log(num); // 4, 5, 6
}
Notice how much cleaner and more intuitive the generator version is compared to the manual iterator implementation. This fundamental capability alone makes generators incredibly useful. But there's more – much more – to their power, especially when we delve into their protocol extensions.
The Enhanced Iterator Interface: Generator Protocol Extensions
The "extension" part of the generator protocol refers to capabilities that go beyond simply yielding values. These enhancements provide mechanisms for greater control, composition, and communication within and between generators and their callers. Specifically, we'll explore yield* for delegation, sending values back into generators, and terminating generators gracefully or with errors.
1. yield*: Delegation to Other Iterables
The yield* (yield-star) expression is a powerful feature that allows a generator to delegate to another iterable object. This means it can effectively "yield all" the values from another iterable, pausing its own execution until the delegated iterable is exhausted. This is incredibly useful for composing complex iteration patterns from simpler ones, promoting modularity and reusability.
How yield* Works
When a generator encounters yield* iterable, it performs the following:
- It retrieves the iterator from the `iterable` object.
- It then starts yielding each value produced by that inner iterator.
- Any value sent back into the delegating generator via its `next()` method is passed through to the delegated iterator's `next()` method (see the sketch after this list).
- If the delegated iterator throws an error, that error is thrown back into the delegating generator.
- Crucially, when the delegated iterator finishes (its `next()` returns `{ done: true, value: X }`), the value `X` becomes the return value of the `yield*` expression itself in the delegating generator. This allows inner iterators to communicate a final result back.
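The pass-through behavior is easy to miss, so here is a minimal sketch (the names are illustrative) showing a value sent to the outer generator arriving inside the delegated one:
function* inner() {
  const received = yield "inner prompt";
  console.log(`inner received: ${received}`);
}
function* outer() {
  yield* inner(); // next() arguments flow straight through to inner()
}
const g = outer();
console.log(g.next().value); // inner prompt
g.next("hello");             // logs: inner received: hello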
Practical Example: Combining Iteration Sequences
function* naturalNumbers() {
yield 1;
yield 2;
yield 3;
}
function* evenNumbers() {
yield 2;
yield 4;
yield 6;
}
function* combinedNumbers() {
console.log("Starting natural numbers...");
yield* naturalNumbers(); // Delegates to naturalNumbers generator
console.log("Finished natural numbers, starting even numbers...");
yield* evenNumbers(); // Delegates to evenNumbers generator
console.log("All numbers processed.");
}
const combined = combinedNumbers();
for (const num of combined) {
console.log(num);
}
// Output:
// Starting natural numbers...
// 1
// 2
// 3
// Finished natural numbers, starting even numbers...
// 2
// 4
// 6
// All numbers processed.
As you can see, yield* seamlessly merges the output of naturalNumbers and evenNumbers into a single, continuous sequence, while the delegating generator manages the overall flow and can inject additional logic or messages around the delegated sequences.
yield* with Return Values
One of the most powerful aspects of yield* is its ability to capture the final return value of the delegated iterator. A generator can return a value explicitly using a return statement. This value is captured by the value property of the last next() call, but also by the yield* expression if it's delegating to that generator.
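To see exactly where a generator's return value surfaces, consider this minimal sketch: the value appears exactly once, on the next() call that completes the generator.
function* withResult() {
  yield 1;
  return 42; // explicit return value
}
const r = withResult();
console.log(r.next()); // { value: 1, done: false }
console.log(r.next()); // { value: 42, done: true } (the return value, delivered once)
console.log(r.next()); // { value: undefined, done: true } (not repeated afterwards)
Note that for...of ignores this final value entirely; only yield* (or manual next() calls, as above) can observe it.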
function* processData(data) {
let sum = 0;
for (const item of data) {
sum += item;
yield item * 2; // Yield processed item
}
return sum; // Return the sum of original data
}
function* analyzePipeline(rawData) {
console.log("Starting data processing...");
// yield* captures the return value of processData
const totalSum = yield* processData(rawData);
console.log(`Original data sum: ${totalSum}`);
yield "Processing complete!";
return `Final sum reported: ${totalSum}`;
}
const pipeline = analyzePipeline([10, 20, 30]);
let result = pipeline.next();
while (!result.done) {
console.log(`Pipeline output: ${result.value}`);
result = pipeline.next();
}
console.log(`Final pipeline result: ${result.value}`);
// Expected Output:
// Starting data processing...
// Pipeline output: 20
// Pipeline output: 40
// Pipeline output: 60
// Original data sum: 60
// Pipeline output: Processing complete!
// Final pipeline result: Final sum reported: 60
Here, processData not only yields transformed values but also returns the sum of the original data. analyzePipeline uses yield* to consume the transformed values and simultaneously captures that sum, enabling the delegating generator to react to or utilize the final result of the delegated operation.
Advanced Use Case: Tree Traversal
yield* is excellent for recursive structures like trees.
class TreeNode {
constructor(value) {
this.value = value;
this.children = [];
}
addChild(node) {
this.children.push(node);
}
// Making the node iterable for a depth-first traversal
*[Symbol.iterator]() {
yield this.value; // Yield current node's value
for (const child of this.children) {
yield* child; // Delegate to children for their traversal
}
}
}
const root = new TreeNode('A');
const nodeB = new TreeNode('B');
const nodeC = new TreeNode('C');
const nodeD = new TreeNode('D');
const nodeE = new TreeNode('E');
root.addChild(nodeB);
root.addChild(nodeC);
nodeB.addChild(nodeD);
nodeC.addChild(nodeE);
console.log("Tree traversal (Depth-First):");
for (const val of root) {
console.log(val);
}
// Output:
// Tree traversal (Depth-First):
// A
// B
// D
// C
// E
This elegantly implements a depth-first traversal using yield*, showcasing its power for recursive iteration patterns.
2. Sending Values into a Generator: The next() Method with Arguments
One of the most striking "protocol extensions" for generators is their bidirectional communication capability. While yield sends values out of a generator, the next() method can also accept an argument, allowing you to send values back into a paused generator. This transforms generators from simple data producers into powerful coroutine-like constructs capable of pausing, receiving input, processing, and resuming.
How it Works
When you call generatorObject.next(valueToInject), the valueToInject becomes the result of the yield expression that caused the generator to pause. If the generator was not paused by a yield (e.g., it was just started or had finished), the injected value is ignored.
function* interactiveProcess() {
const input1 = yield "Please provide the first number:";
console.log(`Received first number: ${input1}`);
const input2 = yield "Now, provide the second number:";
console.log(`Received second number: ${input2}`);
const sum = Number(input1) + Number(input2);
yield `The sum is: ${sum}`;
return "Process complete.";
}
const process = interactiveProcess();
// The first next() call starts the generator; any argument passed here is ignored.
// It runs to the first yield and returns the first prompt.
let response = process.next();
console.log(response.value); // Please provide the first number:
// Send the first number back into the generator
response = process.next(10);
console.log(response.value); // Now, provide the second number:
// Send the second number back
response = process.next(20);
console.log(response.value); // The sum is: 30
// Complete the process
response = process.next();
console.log(response.value); // Process complete.
console.log(response.done); // true
This example clearly demonstrates how the generator pauses, prompts for input, and then receives that input to continue its execution. This is a fundamental pattern for building sophisticated interactive systems, state machines, and more complex data transformations where the next step depends on external feedback.
Use Cases for Bidirectional Communication
- Coroutines and Cooperative Multitasking: Generators can act as lightweight coroutines, voluntarily yielding control and receiving data, useful for managing complex state or long-running tasks without blocking the main thread (when combined with event loops or `setTimeout`).
- State Machines: The generator's internal state (local variables, program counter) is preserved across `yield` calls, making them ideal for modeling state machines where transitions are triggered by external inputs.
- Input/Output (I/O) Simulation: For simulating asynchronous operations or user input, `next()` with arguments provides a synchronous way to test and control the flow of a generator.
- Data Transformation Pipelines with External Configuration: Imagine a pipeline where certain processing steps need parameters that are determined dynamically during execution; see the sketch after this list.
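As a sketch of that last idea (all names here are hypothetical), a transformation stage can ask its driver for a scaling factor mid-stream:
function* scaleStage(numbers) {
  for (const n of numbers) {
    // Yield the raw value; the caller answers with the factor to apply
    const factor = yield n;
    console.log(`${n} * ${factor} = ${n * factor}`);
  }
}
const stage = scaleStage([10, 20]);
console.log(stage.next().value);  // 10 (paused, waiting for a factor)
console.log(stage.next(2).value); // logs "10 * 2 = 20", then yields 20
stage.next(3);                    // logs "20 * 3 = 60", generator finishes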
3. throw() and return() Methods on Generator Objects
Beyond next(), generator objects also expose throw() and return() methods, which provide additional control over their execution flow from the outside. These methods allow external code to inject errors or force early termination, significantly enhancing error handling and resource management in complex generator-based systems.
generatorObject.throw(exception): Injecting Errors
Calling generatorObject.throw(exception) injects an exception into the generator at its current paused state. This exception behaves exactly like a throw statement within the generator's body. If the generator has a try...catch block around the yield statement where it was paused, it can catch and handle this external error.
If the generator does not catch the exception, it propagates out to the caller of throw(), just as any unhandled exception would.
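To make the uncaught case concrete, here is a minimal sketch (with illustrative names) where the generator has no try...catch, so the injected error surfaces at the throw() call site and finishes the generator:
function* unguarded() {
  yield "first";
  yield "second";
}
const gen = unguarded();
gen.next(); // pause at the first yield
try {
  gen.throw(new Error("boom"));
} catch (err) {
  console.log(`Caller caught: ${err.message}`); // Caller caught: boom
}
console.log(gen.next()); // { value: undefined, done: true }
With a try...catch around the paused yield, the generator can instead handle the injected error itself: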
function* dataProcessor() {
try {
const data = yield "Waiting for data...";
console.log(`Processing: ${data}`);
if (typeof data !== 'number') {
throw new Error("Invalid data type: expected number.");
}
yield `Data processed: ${data * 2}`;
} catch (error) {
console.error(`Caught error inside generator: ${error.message}`);
return "Error handled and generator terminated."; // Generator can return a value on error
} finally {
console.log("Generator cleanup complete.");
}
}
const processor = dataProcessor();
console.log(processor.next().value); // Waiting for data...
// Simulate an external error being thrown into the generator
console.log("Attempting to throw an error into the generator...");
let resultWithError = processor.throw(new Error("External interruption!"));
console.log(`Result after external error: ${resultWithError.value}`); // Error handled and generator terminated.
console.log(`Done after error: ${resultWithError.done}`); // true
console.log("\n--- Second attempt with valid data, then an internal type error ---");
const processor2 = dataProcessor();
console.log(processor2.next().value); // Waiting for data...
console.log(processor2.next(5).value); // Data processed: 10
// Now, send invalid data, which will cause an internal throw
let resultInvalidData = processor2.next("abc");
// The generator will catch its own throw
console.log(`Result after invalid data: ${resultInvalidData.value}`); // Error handled and generator terminated.
console.log(`Done after error: ${resultInvalidData.done}`); // true
The throw() method is invaluable for propagating errors from an external event loop or promise chain back into a generator, enabling unified error handling across asynchronous operations managed by generators.
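As a sketch of that idea (a simplified, illustrative runner, not a production library), a driver can resolve promises yielded by a generator and route rejections back in with throw():
// Minimal promise-driving runner (real libraries such as `co` are more thorough)
function run(generatorFn) {
  const gen = generatorFn();
  function step(result) {
    if (result.done) return Promise.resolve(result.value);
    return Promise.resolve(result.value).then(
      value => step(gen.next(value)), // resolved value goes back in via next()
      error => step(gen.throw(error)) // rejection goes back in via throw()
    );
  }
  return step(gen.next());
}
run(function* () {
  try {
    const ok = yield Promise.resolve("first result");
    console.log(ok); // first result
    yield Promise.reject(new Error("network down"));
  } catch (err) {
    console.log(`Recovered inside generator: ${err.message}`);
  }
});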
generatorObject.return(value): Forceful Termination
The generatorObject.return(value) method allows you to prematurely terminate a generator. When called, the generator immediately completes, and the return() call itself returns { value: value, done: true } (or { value: undefined, done: true } if no value is provided); subsequent next() calls return { value: undefined, done: true }. Any finally blocks within the generator will still execute, ensuring proper cleanup.
function* resourceIntensiveOperation() {
try {
let count = 0;
while (true) {
yield `Processing item ${++count}`;
// Simulate some heavy work
if (count > 50) { // Safety break
return "Processed many items, returning.";
}
}
} finally {
console.log("Resource cleanup for intensive operation.");
}
}
const op = resourceIntensiveOperation();
console.log(op.next().value); // Processing item 1
console.log(op.next().value); // Processing item 2
console.log(op.next().value); // Processing item 3
// Decided to stop early
console.log("External decision: terminating operation early.");
let finalResult = op.return("Operation cancelled by user.");
console.log(`Final result after termination: ${finalResult.value}`); // Operation cancelled by user.
console.log(`Done: ${finalResult.done}`); // true
// Subsequent calls will show it's done
console.log(op.next()); // { value: undefined, done: true }
This is extremely useful for scenarios where external conditions dictate that a long-running or resource-consuming iterative process needs to be halted gracefully, such as user cancellation or reaching a certain threshold. The finally block ensures that any allocated resources are properly released, preventing leaks.
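Note that for...of calls return() on the iterator automatically when you leave the loop early (via break, return, or a thrown error), so finally-based cleanup runs in that case too:
function* withCleanup() {
  try {
    yield 1;
    yield 2;
    yield 3;
  } finally {
    console.log("cleanup ran");
  }
}
for (const n of withCleanup()) {
  console.log(n); // 1
  break; // triggers the generator's return(), so "cleanup ran" is logged
}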
Advanced Patterns and Global Use Cases
The generator protocol extensions lay the groundwork for some of the most powerful patterns in modern JavaScript, particularly in managing asynchronicity and complex data flows. While the core concepts remain the same globally, their application can greatly simplify development across diverse international projects.
Asynchronous Iteration with Async Generators and for await...of
Building upon the iterator and generator protocols, ECMAScript introduced Async Generators and the for await...of loop. These provide a synchronous-looking way to iterate over asynchronous data sources, treating streams of promises or network responses as if they were simple arrays.
The Async Iterator Protocol
Just like their synchronous counterparts, async iterables have a [Symbol.asyncIterator] method that returns an async iterator. An async iterator has a next() method that returns a promise resolving to an object of the form { value: ..., done: ... }.
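For illustration, here is a minimal hand-rolled async iterable (the names are illustrative) consumed with for await...of:
const delayedNumbers = {
  [Symbol.asyncIterator]() {
    let n = 0;
    return {
      next() {
        n += 1;
        const done = n > 3;
        // Each step resolves after a short delay, simulating async work
        return new Promise(resolve =>
          setTimeout(() => resolve({ value: done ? undefined : n, done }), 50)
        );
      }
    };
  }
};
(async () => {
  for await (const n of delayedNumbers) {
    console.log(n); // 1, 2, 3
  }
})();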
Async Generator Functions (async function*)
An async function* returns an async generator object, which is an async iterator. Within its body you can use await to pause execution for promises and yield to produce values asynchronously.
async function* fetchPaginatedData(url) {
let nextPage = url;
while (nextPage) {
const response = await fetch(nextPage);
const data = await response.json();
yield data.results; // Yield results from the current page
// Assume API indicates the next page URL
nextPage = data.next_page_url;
if (nextPage) {
console.log(`Fetching next page: ${nextPage}`);
}
await new Promise(resolve => setTimeout(resolve, 100)); // Simulate network delay for next fetch
}
return "All pages fetched.";
}
// Example usage:
async function processAllData() {
console.log("Starting data fetching...");
try {
for await (const pageResults of fetchPaginatedData("https://api.example.com/items?page=1")) {
console.log("Processed a page of results:", pageResults.length, "items.");
// Imagine processing each page of data here
// e.g., storing in a database, transforming for display
for (const item of pageResults) {
console.log(` - Item ID: ${item.id}`);
}
}
console.log("Finished all data fetching and processing.");
} catch (error) {
console.error("An error occurred during data fetching:", error.message);
}
}
// In a real application, replace with a dummy URL or mock fetch
// For this example, let's just illustrate the structure with a placeholder:
// (Note: `fetch` and actual URLs would require a browser or Node.js environment)
// await processAllData(); // Call this in an async context
This pattern is profoundly powerful for handling any sequence of asynchronous operations where you want to process items one by one, without waiting for the entire stream to complete. Think about:
- Reading large files or network streams chunk by chunk (see the sketch after this list).
- Processing data from paginated APIs efficiently.
- Building real-time data processing pipelines.
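As a sketch of the first case (illustrative; it assumes a Node.js readable stream, which is itself async iterable), an async generator can re-chunk a byte stream into complete lines:
// Re-emit an async-iterable stream of buffers as complete text lines
async function* decodeLines(stream) {
  let leftover = "";
  for await (const chunk of stream) {
    const text = leftover + chunk.toString("utf8");
    const lines = text.split("\n");
    leftover = lines.pop(); // keep the trailing partial line for the next chunk
    yield* lines;           // delegate: emit each complete line
  }
  if (leftover) yield leftover; // flush the final partial line
}
// Usage (Node.js):
// import { createReadStream } from "node:fs";
// for await (const line of decodeLines(createReadStream("big.log"))) {
//   console.log(line);
// }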
Globally, this approach standardizes how developers can consume and produce asynchronous data streams, fostering consistency across different backend and frontend environments.
Generators as State Machines and Coroutines
The ability of generators to pause and resume, combined with bidirectional communication, makes them excellent tools for building explicit state machines or lightweight coroutines.
function* vendingMachine() {
  let balance = 0;
  let message = "Welcome! Insert coins (values: 1, 2, 5).";
  while (true) {
    // One yield per transition: emit the current message, receive the next input
    const coin = yield message;
    if (coin === "buy") {
      if (balance >= 5) { // Assuming item costs 5
        balance -= 5;
        message = `Here is your item! Change: ${balance}.`;
      } else {
        message = `Insufficient funds. Need ${5 - balance} more.`;
      }
    } else if ([1, 2, 5].includes(Number(coin))) {
      balance += Number(coin);
      message = `Inserted ${coin}. New balance: ${balance}.`;
    } else {
      message = "Invalid input. Please insert 1, 2, 5, or 'buy'.";
    }
  }
}
const machine = vendingMachine();
console.log(machine.next().value); // Welcome! Insert coins (values: 1, 2, 5).
console.log(machine.next(2).value); // Inserted 2. New balance: 2.
console.log(machine.next(5).value); // Inserted 5. New balance: 7.
console.log(machine.next("buy").value); // Here is your item! Change: 2.
console.log(machine.next("exit").value); // Invalid input. Please insert 1, 2, 5, or 'buy'.
This vending machine example illustrates how a generator can maintain internal state (balance) and transition between states based on external input (coin or "buy"). This pattern is invaluable for game loops, UI wizards, or any process with well-defined sequential steps and interactions.
Building Flexible Data Transformation Pipelines
Generators, especially with yield*, are perfect for creating composable data transformation pipelines. Each generator can represent a processing stage, and they can be chained together.
function* filterEvens(numbers) {
for (const num of numbers) {
if (num % 2 === 0) {
yield num;
}
}
}
function* doubleValues(numbers) {
for (const num of numbers) {
yield num * 2;
}
}
function* sumUpTo(numbers, limit) {
let sum = 0;
for (const num of numbers) {
if (sum + num > limit) {
return sum; // Stop if adding next number exceeds limit
}
sum += num;
yield sum; // Yield cumulative sum
}
return sum;
}
// A pipeline orchestration generator
function* dataPipeline(data) {
console.log("Pipeline Stage 1: Filtering even numbers...");
  // filterEvens and doubleValues are chained directly as iterables; only
  // sumUpTo is delegated with yield*, because we want to capture its return
  // value (a yield* expression evaluates to the delegated generator's return).
const filteredAndDoubled = doubleValues(filterEvens(data));
console.log("Pipeline Stage 2: Summing up to a limit (100)...");
const finalSum = yield* sumUpTo(filteredAndDoubled, 100);
return `Final sum within limit: ${finalSum}`;
}
const rawData = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20];
const pipelineExecutor = dataPipeline(rawData);
let pipelineResult = pipelineExecutor.next();
while (!pipelineResult.done) {
console.log(`Intermediate pipeline output: ${pipelineResult.value}`);
pipelineResult = pipelineExecutor.next();
}
console.log(pipelineResult.value);
// Direct chaining approach for illustration (functional composition):
console.log("\n--- Direct Chaining Example (Functional Composition) ---");
const processedNumbers = doubleValues(filterEvens(rawData)); // Chain iterables
const cumulativeSumIterator = sumUpTo(processedNumbers, 100); // Iterator for the final stage
// Note: for...of would discard the generator's return value, so we drive the
// iterator manually with next() and read the return value from the final result object.
let step = cumulativeSumIterator.next();
while (!step.done) {
  console.log(`Cumulative Sum: ${step.value}`);
  step = cumulativeSumIterator.next();
}
console.log(`Final cumulative sum (from the generator's return value): ${step.value}`);
// Expected output would show filtered, then doubled even numbers, then their cumulative sum up to 100.
// Example sequence for rawData [1,2,3...20] processed by filterEvens -> doubleValues -> sumUpTo(..., 100):
// Filtered evens: [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]
// Doubled evens: [4, 8, 12, 16, 20, 24, 28, 32, 36, 40]
// Cumulative sum up to 100:
// Sum: 4
// Sum: 12 (4+8)
// Sum: 24 (12+12)
// Sum: 40 (24+16)
// Sum: 60 (40+20)
// Sum: 84 (60+24)
// Final cumulative sum (from the generator's return value): 84 (since adding 28 would exceed 100)
The direct chaining example demonstrates how functional composition is naturally facilitated by generators. Each generator takes an iterable (or another generator) and produces a new iterable, allowing for highly flexible and efficient data processing. This approach is highly valued in environments dealing with large datasets or complex analytical workflows, common in various industries globally.
Best Practices for Using Generators
To leverage generators and their protocol extensions effectively, consider the following best practices:
- Keep Generators Focused: Each generator should ideally perform a single, well-defined task (e.g., filtering, mapping, fetching a page). This enhances reusability and testability.
- Clear Naming Conventions: Use descriptive names for generator functions and the values they `yield`. For example, `fetchUsersPage()` or `processCsvRows()`.
- Handle Errors Gracefully: Utilize `try...catch` blocks within generators and be prepared to use `generatorObject.throw()` from external code to manage errors effectively, especially in asynchronous contexts.
- Manage Resources with `finally`: If a generator acquires resources (e.g., opening a file handle, establishing a network connection), use a `finally` block to ensure these resources are released, even if the generator terminates early via `return()` or an unhandled exception.
- Prefer `yield*` for Composition: When combining the output of multiple iterables or generators, `yield*` is the cleanest and most efficient way to delegate, making your code modular and easier to reason about.
- Understand Bidirectional Communication: Be intentional when using `next()` with arguments. It's powerful but can make generators harder to follow if not used judiciously. Document clearly when inputs are expected.
- Consider Performance: While generators are efficient, especially for lazy evaluation, be mindful of excessively deep `yield*` delegation chains or very frequent `next()` calls in performance-critical loops. Profile if necessary.
- Test Thoroughly: Test generators just like any other function. Verify the sequence of yielded values, the return value, and how they behave when `throw()` or `return()` are called on them.
Impact on Modern JavaScript Development
The generator protocol extensions have had a profound impact on the evolution of JavaScript:
- Simplifying Asynchronous Code: Before `async/await`, generators with libraries like `co` were the primary mechanism for writing asynchronous code that looked synchronous. They paved the way for the `async/await` syntax we use today, which internally often leverages similar concepts of pausing and resuming execution.
- Enhanced Data Streaming and Processing: Generators excel at processing large datasets or infinite sequences lazily. This means data is processed on demand, rather than loading everything into memory at once, which is crucial for performance and scalability in web applications, server-side Node.js, and data analytics tools.
- Promoting Functional Patterns: By providing a natural way to create iterables and iterators, generators facilitate more functional programming paradigms, enabling elegant composition of data transformations.
- Building Robust Control Flow: Their ability to pause, resume, receive input, and handle errors makes them a versatile tool for implementing complex control flows, state machines, and event-driven architectures.
In an increasingly interconnected global development landscape, where diverse teams collaborate on projects ranging from real-time data analytics platforms to interactive web experiences, generators offer a common, powerful language feature to tackle complex problems with clarity and efficiency. Their universal applicability makes them a valuable skill for any JavaScript developer worldwide.
Conclusion: Unlocking the Full Potential of Iteration
JavaScript Generators, with their extended protocol, represent a significant leap forward in how we manage iteration, asynchronous operations, and complex control flows. From the elegant delegation offered by yield* to the powerful bidirectional communication via next() arguments, and the robust error/termination handling with throw() and return(), these features provide developers with an unprecedented level of control and flexibility.
By understanding and mastering these enhanced iterator interfaces, you're not just learning a new syntax; you're gaining tools to write more efficient, more readable, and more maintainable code. Whether you're building sophisticated data pipelines, implementing intricate state machines, or streamlining asynchronous operations, generators offer a powerful and idiomatic solution.
Embrace the enhanced iterator interface. Explore its possibilities. Your JavaScript code – and your projects – will be all the better for it.