diff --git a/locale/fa/blog/advisory-board/advisory-board-update.md b/locale/fa/blog/advisory-board/advisory-board-update.md deleted file mode 100644 index 72f301cfe2b1..000000000000 --- a/locale/fa/blog/advisory-board/advisory-board-update.md +++ /dev/null @@ -1,108 +0,0 @@ ---- -title: Advisory Board Update -date: 2014-12-03T18:00:00.000Z -author: Timothy J Fontaine -slug: advisory-board-update -layout: blog-post.hbs ---- - -A lot has been happening in Node.js, so I wanted to bring everyone up to date on -where we are with regard to the advisory board, its working groups, and the -release of v0.12. - -The interim [advisory -board](https://www.joyent.com/blog/node-js-advisory-board) has met three times -since its creation. You can find the minutes from the advisory board meetings -here: [https://nodejs.org/en/about/advisory-board/](https://nodejs.org/en/about/advisory-board/). As -we have more meetings and minutes, we will announce the dates and times for -those meetings and their minutes here on the blog. The next meeting is this -Thursday, December 4th, at 1:30 PM PST. We're looking to collect as much feedback -and input from as many representatives of the community as we can, so it's -important that we keep everyone up to date as much as possible. - -The interim advisory board has been working through a series of topics (in -general meetings as well as working groups) to further hone the scope of the -board, as well as define the structure that the advisory board will use to -conduct its meetings. Everyone on the board wants to make sure we're being as -transparent as possible, so let me describe how things operate so far. The -board uses a traditional two-part conference call structure: a public portion -that is recorded and open for anyone to join, and a private portion that is -only for board members. - -The public portion is meant to provide an update on what happened in the -previous meeting, as well as the status of action items from the previous -meeting. At the end of each public session is an open comment section, where -listeners are able to ask questions and the advisory board can respond. - -Following the public portion, the board dials into the private conference, where -further discussion takes place around specific agenda items, -working group updates, and conversations about those -topics. These conversations are open and frank, and their content is recorded -in the minutes. Those minutes are then published a few days after the meeting -in the GitHub repository -[https://github.com/joyent/nodejs-advisory-board](https://github.com/joyent/nodejs-advisory-board), -as well as on the website -[https://nodejs.org/en/about/advisory-board/](https://nodejs.org/en/about/advisory-board/). - -There are a few working groups so far; for instance, one is focused on making -sure the membership of the board is representative of the community Node.js -serves. While the board was initially bootstrapped with its existing -membership, we want to quickly move to a model that fully represents our -community. We want the board to represent the broadest spectrum of our -community, while still enabling the board to move swiftly and make progress. - -Another working group is having a conversation about governance. This includes -topics like what team makes decisions for Node.js, how you -become a member of that team, how that team sets the roadmap for the -project, and how that team makes decisions.
- -One thing that we all agree on is that we're not going to be using the -Benevolent Dictator model. In fact, recently the project hasn't been operating -that way. We can be clearer about that in our -[documentation](https://nodejs.org/en/about/organization). We all agree we want -a healthy and vibrant team, a team focused on making progress for Node.js, not -for progress's sake, but for the betterment of the software project and the -community we serve. We also agree that this means that there should be -consensus among the team. The conversation has been fruitful and is ongoing; -we're continuing to work through the finer points of how much consensus we -need. - -I want to take a moment to describe what consensus means in this context. The -consensus model is about accountability: accountability for the changes being -integrated into the project, accountability for documentation, and -accountability for releases. While members of the team are responsible for -subsystems or features of Node.js, everyone reviews each other's changes. They -make sure to understand the impact on their relevant responsibilities. - -The goal of the team, especially that of the project lead, is to drive -consensus and ensure accountability. This means asking critical questions and -being able to answer them specifically and succinctly, for example: - - * What are we trying to solve with this change? - * Does this change effectively solve this problem? - * Does this API have a consumer? - * Does this API reach the broadest range of use cases? - * Is this API supportable? - * Does this change have adverse effects on other subsystems or use cases (and is that acceptable)? - * Does this change have tests that verify its operation, now and in the future? - * Does this change pass our style guidelines? - * Does this change pass our integration tests for the matrix of our supported configurations? - - For instance: ia32 and x64 for Windows, Linux, OS X, SmartOS - -These are just some of the questions, and while the questions are not unusual -or unique to Node.js, they are still important. - -Finally, we are very close to releasing v0.12; there's only one major patch -we're waiting to land. Once that's done we'll be releasing v0.11.15 as a -release candidate. Assuming no severe issues are filed against v0.11.15, we will -be going live with v0.12 about two weeks after the v0.11.15 release. - -If you have questions for the advisory board, you can email -[advisoryboard@nodejs.org](mailto:advisoryboard@nodejs.org) or file an issue on -its repository -[https://github.com/joyent/nodejs-advisory-board](https://github.com/joyent/nodejs-advisory-board). -Thanks for all of your continued contributions to Node.js, in the form of -[filing issues](https://github.com/joyent/node/issues), [submitting pull -requests](https://github.com/joyent/node/pulls), and publishing your modules. -Node.js is lucky to have such an enthusiastic and engaged community, and we're -excited to be working with you on the future of Node.js.
diff --git a/locale/fa/blog/advisory-board/index.md b/locale/fa/blog/advisory-board/index.md deleted file mode 100644 index 40847a786963..000000000000 --- a/locale/fa/blog/advisory-board/index.md +++ /dev/null @@ -1,6 +0,0 @@ ---- -title: Advisory Board -layout: category-index.hbs -listing: true -robots: noindex, follow ---- diff --git a/locale/fa/blog/advisory-board/listening-to-the-community.md b/locale/fa/blog/advisory-board/listening-to-the-community.md deleted file mode 100644 index a56801cdcac7..000000000000 --- a/locale/fa/blog/advisory-board/listening-to-the-community.md +++ /dev/null @@ -1,22 +0,0 @@ ---- -title: Listening to the Community -date: 2014-12-05T21:30:00.000Z -author: Advisory Board -slug: listening-to-the-community -layout: blog-post.hbs ---- - -We assembled the Node.js Advisory Board (AB) to listen to the community and -make the necessary changes to have a unified direction for Node.js, a -passionate group of developers, a vibrant ecosystem of product and service -providers, and a satisfied user base. Over the last month we have made great -progress on an open governance model, API standards, IP management, and -transparency to ensure the project is community-driven. These efforts -explicitly target helping resolve conflicts, with the goal of moving the -community forward together. It is important that we understand the voices of -dissent and frustration and work together to build the greater ecosystem. We -are committed to this goal. - -Node.js remains the trusted platform that users rely on for creative projects -and to drive business goals. The v0.12 release will ship shortly, and the -project team is already engaged in discussions about the next release. diff --git a/locale/fa/blog/announcements/apigee-rising-stack-yahoo.md b/locale/fa/blog/announcements/apigee-rising-stack-yahoo.md deleted file mode 100644 index f3d471876bad..000000000000 --- a/locale/fa/blog/announcements/apigee-rising-stack-yahoo.md +++ /dev/null @@ -1,38 +0,0 @@ ---- -title: Apigee, RisingStack and Yahoo Join the Node.js Foundation -date: 2015-12-08T12:00:00.000Z -status: publish -category: Annoucements -slug: apigee-rising-stack-yahoo -layout: blog-post.hbs ---- - -> New Silver Members to Advance Node.js Growth and Enterprise Adoption - -**NODE.JS INTERACTIVE 2015, PORTLAND, OR.** — [The Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced Apigee, RisingStack and Yahoo are joining the Foundation as Silver Members to build and support the Node.js platform. With over 2 million downloads per month, Node.js is the runtime of choice for developers building everything from enterprise applications to Industrial IoT. - -The Node.js Foundation members work together alongside the community to help grow this diverse technology for large financial services, web-scale, cloud computing companies, and more. The newly added [Long-Term Support](https://nodejs.org/en/blog/release/v4.2.0/) release version 4.0 is just one of the many initiatives from the Foundation, which addresses the needs of enterprises that are using Node.js in more complex production environments, and signals the growing maturity of the technology.
- -“We continue to welcome new Node.js Foundation members that are committed to providing the financial and technical resources needed to ensure the technology continues to evolve, while nurturing the community and ecosystem at the same time,” said Danese Cooper, Chairperson of the Node.js Foundation Board. “We are excited to have Apigee, RisingStack, and Yahoo on board to help grow the diversity of the platform and the community.” - -The new members are joining just in time for the inaugural Node.js Interactive event taking place today and tomorrow in Portland, OR. The conference focuses on frontend, backend and IoT technologies, and the next big initiatives for the Node.js Foundation. It includes more than 50 tutorials, sessions and keynotes. To stream the event, go to [http://events.linuxfoundation.org/events/node-interactive/program/live-video-stream](http://events.linuxfoundation.org/events/node-interactive/program/live-video-stream). - -More information about the newest Node.js Foundation members: - -[Apigee](https://apigee.com/about/) provides an intelligent API platform for digital businesses. Headquartered in San Jose, California, Apigee’s software supports some of the largest global enterprises. Developers can use the Node.js software platform to build highly customized application programming interfaces (APIs) and apps in the [Apigee API management platform](http://apigee.com/about/products/api-management). The integration of the Node.js technology allows developers to use code to create specialized APIs in Apigee, while utilizing the huge community of JavaScript developers. - -“We want to provide the developer community with the best platform for building today’s modern apps and APIs,” said Ed Anuff, executive vice president of strategy at Apigee. “We are committed to the advancement of Node.js and look forward to continuing to utilize the strengths and further possibilities of the technology. The Node.js Foundation provides an excellent place for us to help push this technology to become even better for our developers that use it every day.” - -[RisingStack](https://risingstack.com/) was founded in 2014 by Gergely Nemeth and Peter Marton as a full stack JavaScript consulting company. It helps companies with their digital transition to Node.js and offers a microservice monitoring tool called [Trace](http://trace.risingstack.com/). RisingStack also contributes to several open source projects and engages the developer community via a popular JavaScript/DevOps [engineering blog](https://blog.risingstack.com/) with a large library of long-form articles. - -“Node.js is extremely important in JavaScript development, and we have experienced a rapid rise of interest in the technology from enterprises,” said Gergely Nemeth, CEO and Co-Founder of RisingStack. “Our business was established to support this growing technology, and we are very excited to join the Node.js Foundation to help broaden this already active community and continue its growth through open governance.” - -Yahoo is a guide focused on informing, connecting and entertaining its users. By creating highly personalized experiences for its users, Yahoo keeps people connected to what matters most to them, across devices and around the world. In turn, Yahoo creates value for advertisers by connecting them with the audiences that build their businesses.
- -“Joining the Node.js Foundation underscores our deep appreciation for the Node.js community, and our commitment to drive its health and growth,” said Preeti Somal, vice president of engineering, Yahoo. “As a technology pioneer with a deep legacy of JavaScript expertise and a strong commitment to open source, we saw the promise of Node.js from the start and have since scaled to become one of the industry’s largest deployments. We embrace Node.js’s evolution and encourage our developers to be contributing citizens of the Open Source community.” - -Additional Resources -* Learn more about the [Node.js Foundation](https://foundation.nodejs.org/) and get involved with [the project](https://nodejs.org/en/get-involved/). - -About Node.js Foundation -Node.js Foundation is a collaborative open source project dedicated to building and supporting the Node.js platform and other related modules. Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 2 million downloads per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, npm, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), YLD!, and Yahoo. Get involved here: [https://nodejs.org](https://nodejs.org/en/). diff --git a/locale/fa/blog/announcements/appdynamics-newrelic-opbeat-sphinx.md b/locale/fa/blog/announcements/appdynamics-newrelic-opbeat-sphinx.md deleted file mode 100644 index 14c24f093855..000000000000 --- a/locale/fa/blog/announcements/appdynamics-newrelic-opbeat-sphinx.md +++ /dev/null @@ -1,44 +0,0 @@ ---- -title: AppDynamics, New Relic, Opbeat and Sphinx Join the Node.js Foundation as Silver Members -date: 2016-03-09T21:00:00.000Z -category: Annoucements -slug: appdynamics-newrelice-opbeat-sphinx -layout: blog-post.hbs ---- - -> Foundation Announces Dates for Node.js Interactive Conferences in Amsterdam and Austin, Texas - -SAN FRANCISCO, Mar. 9, 2016 — The [Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced AppDynamics, New Relic, Opbeat and Sphinx are joining the Foundation as Silver Members to continue to sustain and grow the Node.js platform. - -Many of the new members are within the application performance management industry, both established and up-and-coming vendors. Application performance management is an essential part of any infrastructure and there is a need across public, private and hybrid clouds to ensure that current and future products offer next-generation application performance with Node.js as a core component to the stability and potential of these offerings. - -The new members have the opportunity to support a range of Foundation activities such as new training programs and user-focused events like the new, ongoing [Node.js Live](http://live.nodejs.org/) series and Node.js Interactive events. Expanding to Europe this year, Node.js Interactive will take place September 15-16 in Amsterdam, Netherlands; while the North America event will be held November 29-30, 2016, in Austin, Texas. 
More information on the conference to come. - -Node.js has grown tremendously in the last year with a total of [3.5 million](https://www.npmjs.com/) users and 100 percent year-over-year growth. It is becoming ubiquitous across numerous industries from financial services to media companies, and is heavily used by frontend, backend and IoT developers. - -“While the popularity of Node.js has dramatically increased in recent years, the Foundation is committed to maintaining a stable, neutral and transparent ground to support continuation of the technology’s growth,” said Danese Cooper, Chairperson of the Node.js Foundation Board. “We are pleased to have AppDynamics, New Relic, Opbeat and Sphinx join the Foundation to help support both continued expansion for the technology and stability needs of the community.” - -More About the New Members: - -[AppDynamics](http://www.appdynamics.com/) is the application intelligence company that provides real-time insights into application performance, user experience, and business outcomes with cloud, on-premises, and hybrid deployment flexibility. Seeing the growing popularity of Node.js as a platform for building fast and scalable web and mobile applications, AppDynamics created a Node.js monitoring solution built on their core APM platform. The solution helps customers monitor Node.js applications in real-time and diagnose performance bottlenecks while running in live production or development environments. - -“Node.js is clearly taking off, and we’ve seen significant adoption of the platform in production for quite some time now, especially within the enterprise. We have participated in multiple Node.js events in the past, and look forward to continuing to support the longevity of this project, which is important to the developers that we serve,” said AppDynamics Chief Technology Officer and Senior Vice President of Product Management, Bhaskar Sunkara. - -[New Relic](https://newrelic.com/) is a software analytics company that delivers real-time insights and helps companies securely monitor their production software in virtually any environment, without having to build or maintain dedicated infrastructure. New Relic’s agent helps pinpoint Node.js application performance issues across private, public or hybrid cloud environments. - -“We're seeing huge growth in our Node.js application counts on a daily basis, from customers of all sizes - there's just as much interest from the Fortune 100 as there is from new startups. New Relic's engineers have been contributing to Node.js's core development for years, and we're excited to help accelerate its advancement and success even further by supporting the Node.js Foundation," said Tim Krajcar, Engineering Manager, Node.js Agent, New Relic. - -[Opbeat](https://opbeat.com/) provides next-generation performance insights, specifically built for JavaScript developers. Opbeat maps production issues to the developers that write the code, leading to faster debugging and more coding. The young company recently launched [full support for Node.js](https://opbeat.com/nodejs/). - -“We’re seeing massive interest in Opbeat within the Node community - from larger organizations to smaller start-ups - so we’re excited to join the Foundation to help support the community. At the end of the day, our customers are developers and we want to contribute to the increased popularity of Node amongst developers and CTOs,” said Rasmus Makwarth, Co-Founder and CEO of Opbeat. 
- -[Sphinx](http://sphinx.sg/) was established in 2014 by experienced Vietnamese developers from Silicon Valley with the aim of becoming the leading company for Node.js and the MEAN stack — the group has co-founded the Vietnamese Node.js and Angular.js communities. The consulting team helps take large-scale applications from the concept phase to production for some of the largest global enterprises and government departments. - -“Becoming a silver member is a breakthrough for us, and gives us the opportunity to establish long-lasting relationships with companies that also share a common interest in the rapidly growing Node.js technology. We look forward to collaborating with other Foundation members and continuing to develop and support the open source community,” said Hai Luong, CEO and Co-Founder of Sphinx. - -About Node.js Foundation -Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 3.5 million active users per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, AppDynamics, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, New Relic, npm, Opbeat, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), Sphinx, YLD!, and Yahoo!. Get involved here: [https://nodejs.org](https://nodejs.org/). - -The Node.js Foundation is a Linux Foundation Project. Linux Foundation Projects are independently funded software projects that harness the power of collaborative development to fuel innovation across industries and ecosystems. [www.linuxfoundation.org](http://www.linuxfoundation.org/) diff --git a/locale/fa/blog/announcements/cars-dynatrace.md b/locale/fa/blog/announcements/cars-dynatrace.md deleted file mode 100644 index 3bb4deaad2ab..000000000000 --- a/locale/fa/blog/announcements/cars-dynatrace.md +++ /dev/null @@ -1,85 +0,0 @@ ---- -title: Cars.com and Dynatrace join the Foundation to support the stability and success of the Node.js platform -date: 2016-08-17T12:00:00.000Z -status: publish -category: Annoucements -slug: cars-dynatrace -layout: blog-post.hbs ---- - -> New Node.js Foundation Members Drive Enterprise Growth - -**SAN FRANCISCO, Aug. 17, 2016** — -[The Node.js Foundation](https://foundation.nodejs.org/), a community-led -and industry-backed consortium to advance the development of the Node.js -platform, today announced [Cars.com](https://www.cars.com/) and -[Dynatrace](https://www.dynatrace.com) have joined the Foundation as -Silver Members. - -An ecosystem of tools and services for enterprises is rapidly coalescing around -Node.js, giving it a boost as a universal platform powering everything from -enterprise applications, robots, API engines, cloud stacks and mobile websites. -The JavaScript runtime environment is resource-efficient, high-performing and -well-suited to scalability, making it an increasingly mainstream technology in -startups and enterprises alike for application development and microservice -architectures. Furthermore, Node.js ranks among the Top 10 languages for full -stack, frontend and backend developers surveyed by Stack Overflow in its 2016 -developer survey.
- -More about today’s new Node.js Foundation members: - -Launched in 1998 and headquartered in Chicago, [Cars.com](https://www.cars.com/) -is a leading online destination that offers information from experts and -consumers to help car shoppers and owners buy, sell and service their vehicles. -The site offers millions of new and used vehicle listings, an extensive -database of consumer reviews, research and pricing tools, unbiased expert -content, and multiple options to sell a vehicle. Cars.com uses Node.js to help -scale its high-trafficked website to meet its users’ demands and service -expectations. - -“Prior to Node.js, we were using older content management solutions that were -not allowing us to effectively meet the demands of Cars.com, which receives -around 30 to 35 million visits to its web properties each month,” said Darrell -Pratt, director of software development at Cars.com. “Node.js provides the -necessary utility to allow us to grow and change to a microservice -infrastructure. There is so much potential in the future of Node.js through its -community, extensive libraries, and excellent tooling, and we are excited to -help fuel that growth through the Node.js Foundation.” - -[Dynatrace](https://www.dynatrace.com), a digital performance management -software company, believes Node.js is a core component for its enterprise -customers. As more enterprises begin to go through digital transformations and -implement microservice-type architectures, Node.js is a key technology that -ties frontend and backend systems together to provide an end-to-end view of -application performance with the help of Dynatrace offerings. - -“Digital transformation is constantly changing how companies do business, -something we see first-hand working directly with enterprises to meet their -performance monitoring needs,” said Alois Reitbauer, chief technology -strategist at Dynatrace. “Node.js very often plays a key role in their -transformation process, so we are excited to actively contribute to the Node.js -platform and help developers improve their monitoring capabilities.” - -“By joining the Node.js Foundation, companies are investing in the stability of -Node.js and helping to accelerate the widespread adoption and development of -Node.js,” said Mikeal Rogers, community manager at the Node.js Foundation. “New -members like Cars.com and Dynatrace are incredibly important to the -Foundation's work to develop services, education, training and events that -support the needs and demands of enterprise users.” - -The Node.js Foundation will be holding -[Node.js Interactive Europe](http://events.linuxfoundation.org/events/node-interactive-europe) in -Amsterdam September 15-18; and -[Node.js Interactive North America](http://events.linuxfoundation.org/events/node-interactive) in Austin, Texas, -November 29 - December 2. This is the only vendor-neutral event that offers an -in-depth look at the future of Node.js from -the developers who are driving the code forward. Node.js Core contributors will -offer insights into new developments, while enterprise users and vendors will -share best practices around tools, training and other services needed to -optimize Node.js. - -### About the Node.js Foundation - -Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 4 million active users per month. 
It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy and NodeSource, and Silver members include Apigee, AppDynamics, Cars.com, Codefresh, DigitalOcean, Dynatrace, Fidelity, Google, Groupon, nearForm, New Relic, npm, Opbeat, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), Sphinx, YLD, and Yahoo!. Get involved here: https://nodejs.org. diff --git a/locale/fa/blog/announcements/foundation-advances-growth.md b/locale/fa/blog/announcements/foundation-advances-growth.md deleted file mode 100644 index b8dd72ae440f..000000000000 --- a/locale/fa/blog/announcements/foundation-advances-growth.md +++ /dev/null @@ -1,53 +0,0 @@ ---- -title: Node.js Foundation Advances Platform with More Than Three Million Users -date: 2015-12-08T12:00:00.000Z -status: publish -category: Annoucements -slug: foundation-advances-growth -layout: blog-post.hbs ---- - -> Node.js Platform Stronger Than Ever with New Node.js Foundation Members, -Community Contributions, and 100 Percent Year-Over-Year User Growth - -**NODE.JS INTERACTIVE 2015, PORTLAND, OR.** — [The Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today is announcing major community, code and membership growth, adoption statistics of the technology at large, and the Foundation’s new incubation program. - -The Node.js Foundation was founded in 2015 to accelerate the development of Node.js and support the large ecosystem that it encompasses through open governance. As part of this mission, the Foundation announced its first incubation project libuv. Libuv is a software library that provides asynchronous event notification and improves the Node.js programming experience. The project is both critical to Node.js and already widely used, making it a natural fit for the Foundation. Under the Foundation's umbrella, it will receive additional support and mentorship. - -The first Node.js Interactive event unites more than 700 developers, engineers, system architects, DevOps professionals and users representing a wide range of projects, products and companies in Portland, Ore. Node.js Interactive brings together a broad range of speakers to help experienced and novice Node.js users alike learn tips, best practices, new skills, as well as gain insight into future developments for the technology. With Node.js being used in 98% of the Fortune 500 companies regularly, the event will also highlight the maturation of the technology within enterprises. - -Attendees have the opportunity to see and learn more about how organizations like Capital One, GoDaddy, Intel, NodeSource, npm and Uber are using Node.js to meet their innovation needs. Attendees are also getting a first look at Node.js advancements announced and demoed this week including: - -* JOYENT: Joyent is announcing the 2016 Node.js Innovation Program, which provides Node.js expertise, marketing support and free cloud or on-premise infrastructure to start-ups and teams within larger enterprises that are driving innovation powered by Node.js. 
The 2015 program included [bitHound](https://www.bithound.io/), whose team will speak at Node.js Interactive about innovative approaches to identifying risk and priorities in dependencies and code. More info can be found [here](https://www.joyent.com/innovation). -

Joyent also released a new Node.js, Docker and NoSQL reference [architecture](https://www.joyent.com/blog/how-to-dockerize-a-complete-application) that enables microservices in seconds. To learn more, the company will be demoing this at booth number 7. - -* IBM is featuring multiple Node.js based solutions for: a complete API lifecycle via StrongLoop Arc and Loopback.io; real-time location tracking in Node using Cloudant®  data services; how to write Node applications against Apache Spark; and end-to-end mobile applications using IBM MobileFirst -- all running on Bluemix®, IBM's Cloud Platform. - -* INTEL: A leader in the Internet of Things, Intel will be demoing a SmartHouse at booth number 0013. Based on the [IoTivity](https://www.iotivity.org/) open source project, which is sponsored by the [Open Interconnect Consortium (OIC)](http://openinterconnect.org/), the SmartHouse includes a home gateway from MinnowBoard Max client, three Edison controlled LEDs, fan, motion sensor, and smoke detector. Intel developed Node.js binding for IoTivity to power the demo with everything being controlled from a WebGL 3D virtual house interface. - -* NEARFORM: nearForm is holding a Node Expert Clinic for attendees who are looking for advice on Node.js adoption or struggling with any existing problems. Individuals will be connected to experts including Matteo Collina, Colin Ihrig, Wyatt Preul and Peter Elger for 30 minute sessions which can be arranged at the conference. -

In addition, the company is sharing real customer successes and adoption statistics of Node.js at large. The company gathered the data from 100 of its Node customers across the globe. The leading industries in implementation and adoption of Node.js include enterprise software companies and media companies. Financial, payment, travel, e-commerce and IoT tie for third among the industries leading in both adoption and implementation. -

Startups are leading the way in adding Node.js to their strategies, but in 2013 and 2014 larger incumbents started to transition their stacks with Node.js as a core technology; notable names include PayPal, Condé Nast, and Costa. In terms of startup saturation: - - * 25% of developers at growth-stage companies in enterprise software are using Node.js; - * 25% of developers at FinTech startups are using Node.js; - * Healthcare startups are using Node.js in a significant way - an average of 33% of developers are using Node.js, with the primary use case being to enable rapid innovation; - * 48% of developers are using Node.js at IoT companies; - * 80% of developers at education startups are using the technology. - -* NODESOURCE: The company will showcase [N|Solid](https://nodesource.com/products/nsolid), an enterprise-grade Node.js platform. It extends the capability of Node.js to provide increased developer productivity, protection of critical applications, and peak application performance. -

The company will also have [free upgrade tools](https://marketing.nodesource.com/acton/fs/blocks/showLandingPage/a/15680/p/p-001f/t/page/fm/4) available to help developers implement the latest Long Term Support version, v4. This version is essential for enterprises and companies using Node.js in more complex environments, as it is the most stable and secure release line of the platform. - -* SYNCHRO LABS: Silver Node.js Interactive sponsor Synchro Labs announced the launch of the Synchro platform, a new tool that allows enterprise developers to create high-quality, high-performance, cross-platform native mobile applications using the Node.js framework. The company is demoing the new platform at booth 2 during the conference. More information on the recent announcement is available [here](https://synchro.io/launch). - -### Community and Code Growth -Since the independent Node.js Foundation launched earlier this year, development progress has continued to accelerate, with dramatic increases in contributions to the project. In the past eight months, the community has grown from 8 to more than 400 contributors, with first-time contributors as high as 63 percent per month. There are more than 3 million active users of Node.js, which has increased 100% year over year. - -Currently, the core Node.js repository includes 52 committers, while there have been more than 709 contributors over time. Node.js core had 77 active contributors in October alone, with 46 of those being first-time contributors. More than 400 developers currently have commit rights to some part of the Node.js project. The community is focused on creating a new type of open source contribution philosophy called participatory governance, which liberalizes contribution policies and provides direct ownership to contributors. - -In addition, the Foundation announced three new Silver members: Apigee, RisingStack, and Yahoo. You can find details of the new membership here. - -#### About Node.js Foundation - -Node.js Foundation is a collaborative open source project dedicated to building and supporting the Node.js platform and other related modules. Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 2 million downloads per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, npm, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), YLD!, and Yahoo!. Get involved here: [https://nodejs.org](https://nodejs.org). - diff --git a/locale/fa/blog/announcements/foundation-elects-board.md b/locale/fa/blog/announcements/foundation-elects-board.md deleted file mode 100644 index 664c7acae27e..000000000000 --- a/locale/fa/blog/announcements/foundation-elects-board.md +++ /dev/null @@ -1,38 +0,0 @@ ---- -title: Node.js Foundation Elects Board of Directors -date: 2015-09-04T21:00:00.000Z -status: publish -category: Annoucements -slug: foundation-elects-board -layout: blog-post.hbs ---- - -> New Foundation Committed to Accelerating Growth of the Node.js Platform Also Adds Marketing Chair and Community Manager - -SAN FRANCISCO, Sept.
4, 2015 – The [Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced key executives have been elected to its Board of Directors. The Board of Directors represents the broad Node.js community and will guide the Foundation as it executes on its mission to enable widespread adoption and help accelerate development of Node.js and other related modules. - -The Node.js Foundation board, which sets the business and technical direction as well as oversees IP management, marketing, and events on behalf of the organization, includes: - -* [Danese Cooper](https://www.linkedin.com/in/danesecooper), chairman of the board, distinguished member of technical staff - open source at PayPal; -* [Scott Hammond](https://www.linkedin.com/pub/scott-hammond/1/a4b/92a), vice-chairman of the board, chief executive officer at Joyent; -* [Brian McCallister](https://www.linkedin.com/in/brianmccallister), silver-level director of the board, chief technology officer of platforms at Groupon; -* [Todd Moore](https://www.linkedin.com/pub/todd-moore/2b/540/798), board member, vice president of open technology at IBM; -* [Steve Newcomb](https://www.linkedin.com/in/stevenewcomb), board member, founder and chief executive officer at Famous Industries; -* [Gianugo Rabellino](https://www.linkedin.com/in/gianugo), secretary of the board, senior director of open source programs at Microsoft; -* [Charlie Robbins](https://www.linkedin.com/in/charlierobbins), gold-level director of the board, director of engineering at GoDaddy.com; -* [Imad Sousou](https://www.linkedin.com/pub/imad-sousou/6/b49/2b8), board member, vice president and general manager at Intel; -* [Rod Vagg](https://www.linkedin.com/in/rvagg), technical steering committee chairperson, chief node officer at NodeSource. - -In addition to formalizing the board, [Bill Fine](https://www.linkedin.com/pub/bill-fine/2/497/916), vice president of product and marketing at Joyent, was elected as the marketing chairperson. The Linux Foundation also hired [Mikeal Rogers](https://www.linkedin.com/in/mikealrogers) as its community manager to help support and guide the new organization. - -“The new board members represent the diversity of the Node.js community and the commitment that these companies have to supporting its overall efforts,” said Danese Cooper, chairman of the board, Node.js Foundation. “Node.js is incredibly important to the developer ecosystem and is increasingly relevant for building applications on devices that are changing the pace of commerce. The board will work to support and build the Node.js platform using the blueprint of an open governance model that is transparent and supportive of its community.” - -In early June, the Node.js and io.js developer community announced that they were merging their respective code base to continue their work in a neutral forum, the Node.js Foundation. The new leaders will help support the ongoing growth and evolution of the combined communities and will foster a collaborative environment to accelerate growth and the platform’s evolution. - -### About Node.js Foundation - -Node.js Foundation is a collaborative open source project dedicated to building and supporting the Node.js platform and other related modules. Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 2 million downloads per month. 
It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft and PayPal. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, npm, Sauce Labs, SAP, StrongLoop and YLD!. Get involved here: [https://nodejs.org](https://nodejs.org). - -The Node.js Foundation is a Collaborative Project at The Linux Foundation. Linux Foundation Collaborative Projects are independently funded software projects that harness the power of collaborative development to fuel innovation across industries and ecosystems. [www.linuxfoundation.org](http://www.linuxfoundation.org) diff --git a/locale/fa/blog/announcements/foundation-express-news.md b/locale/fa/blog/announcements/foundation-express-news.md deleted file mode 100644 index 1da42450bf97..000000000000 --- a/locale/fa/blog/announcements/foundation-express-news.md +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Node.js Foundation to Add Express to its Incubator Program -date: 2016-02-10T21:00:00.000Z -category: Annoucements -slug: Express as Incubator Project -layout: blog-post.hbs ---- - -> Node.js Foundation to Add Express to its Incubator Program - -SAN FRANCISCO, Feb. 10, 2016 — The [Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced Express, the most popular Node.js web server framework, and some of its constituent modules are on track to become a new incubation project of the Foundation. - -With [53+ million downloads in the last two years](http://npm-stat.com/charts.html?package=express&author=&from=&to=), Express has become one of the key toolkits for building web applications and its stability is essential for many Node.js users, especially those that are just getting started with the platform. Express also underpins some of the most significant projects that support Node.js, including [kraken.js](http://krakenjs.com/), a secure and scalable layer that extends Express and is heavily used by enterprises. Kraken.js was open sourced [by PayPal in 2014](https://www.paypal-engineering.com/2014/03/03/open-sourcing-kraken-js/). It also underpins [Sails.js](http://sailsjs.org/), a web framework that makes it easy to build custom, enterprise-grade Node.js apps, and [Loopback](http://loopback.io/), a Node.js API framework. - -“This framework is critical to a significant portion of many Node.js users,” said Mikeal Rogers, Community Manager of the Node.js Foundation. “Bringing this project into the Node.js Foundation, under open governance, will allow it to continue to be a dependable choice for many enterprises and users, while ensuring that we retain a healthy ecosystem of competing approaches to solving problems that Express addresses.” - -"The work around developing and maintaining Express has been a tremendous asset to the community," said Rod Vagg, Chief Node Officer at NodeSource and Technical Steering Committee Director of the Node.js Foundation. "With 5 million package downloads in the last month, the stability of this project, that will get a huge boost through open governance, is very important to the efforts of the Node.js Foundation in supporting Node.js as a technology and developer ecosystem." 
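For readers who are new to the framework described above, a minimal Express application (an illustrative sketch only, not taken from the announcement) is just a few lines of Node.js:

```js
'use strict';
// Illustrative sketch: a minimal Express web server.
// Assumes Express has been installed locally, e.g. with `npm install express`.
const express = require('express');
const app = express();

// Respond to GET requests on the root path.
app.get('/', (req, res) => {
  res.send('Hello from Express on Node.js');
});

// Start listening for HTTP requests.
app.listen(3000, () => {
  console.log('Listening on http://localhost:3000');
});
```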
- -“IBM is committed not only to growing and supporting the Node.js ecosystem, but to promoting open governance for the frameworks that enable Node.js developers to work smarter, faster and with more agility,” said Todd Moore, IBM, VP Open Technology. “We are thrilled that Express is being introduced as an incubated top level project of the Foundation. Express has a bright future and a new long term home that will ensure resources, reliability and relevancy of Express to the global Node.js developer community.” - -Assets related to Express are being contributed to the Node.js Foundation by IBM. - -The Node.js Foundation Incubator Program was launched last year. Projects under the Node.js Foundation Incubator Program receive assistance and governance mentorship from the Foundation's Technical Steering Committee and related working groups. The Incubator Program is intended to support the many needs of Node.js users to maintain a competitive and robust ecosystem. - -### About Node.js Foundation - -Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 3 million active users per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, npm, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), YLD!, and Yahoo!. Get involved here: https://nodejs.org. - -The Node.js Foundation is a Collaborative Project at The Linux Foundation. Linux Foundation Collaborative Projects are independently funded software projects that harness the power of collaborative development to fuel innovation across industries and ecosystems. [www.linuxfoundation.org](http://www.linuxfoundation.org) diff --git a/locale/fa/blog/announcements/foundation-v4-announce.md b/locale/fa/blog/announcements/foundation-v4-announce.md deleted file mode 100644 index ec9c97e36bd3..000000000000 --- a/locale/fa/blog/announcements/foundation-v4-announce.md +++ /dev/null @@ -1,45 +0,0 @@ ---- -title: Node.js Foundation Combines Node.js and io.js Into Single Codebase in New Release -date: 2015-09-14T17:00:00.000Z -status: publish -category: Annoucements -slug: foundation-v4-announce -layout: blog-post.hbs ---- - -More Stability, Security, and Improved Test Coverage Appeals to Growing Number of Enterprises Using Node.js - -SAN FRANCISCO, Sept. 14, 2015 – The [Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced the release of Node.js version 4.0.0. A record number of individuals and companies helped to contribute to the release, which combines both the Node.js project and io.js project in a single codebase under the direction of the Node.js Foundation. - -Currently, Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 2 million downloads per month. With major stability and security updates, a new test cluster, support for ARM processors and long-term support, Node.js v4 represents the latest framework innovation for enterprise users leveraging it to run JavaScript programs. 
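To give a concrete, illustrative taste of what “enabled by default” means in practice (this snippet is not part of the original announcement), the following minimal sketch exercises several of the ES6 features detailed in the next paragraph, such as classes, template strings, arrow functions, Promises and the new collections, and runs unmodified on Node.js v4:

```js
'use strict';
// Illustrative sketch: a few ES6 features enabled by default in Node.js v4 (V8 4.5).

class Greeter {
  constructor(name) {
    this.name = name; // ES6 class syntax
  }
  greet() {
    return `Hello, ${this.name}!`; // template strings
  }
}

const greeters = new Map(); // ES6 collections (Map, Set)
greeters.set('v4', new Greeter('Node.js v4'));

// Built-in Promises and arrow functions, with no userland library required.
Promise.resolve(greeters.get('v4'))
  .then((greeter) => console.log(greeter.greet()));
```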
- -The release is named version 4.0.0 because it includes major updates from io.js version 3.0.0, and it also contains V8 v4.5, the same version of V8 shipping with the Chrome web browser today. This brings with it many bonuses for Node.js users, most notably a raft of new [ES6](https://nodejs.org/en/docs/es6/) features that are enabled by default, including block scoping, classes, typed arrays (Node's Buffer is now backed by Uint8Array), generators, Promises, Symbols, template strings, collections (Map, Set, etc.) and, new to V8 v4.5, arrow functions. - -Node.js v4 also brings a plan for [long-term support (LTS)](https://github.com/nodejs/LTS/) and a regular release cycle. Release versioning now follows the Semantic Versioning Specification, a specification for version numbers of software libraries and similar dependencies, so expect increments of both minor and patch versions over the coming weeks as bugs are fixed and features are added. The LTS plan will support enterprise users that have longer-term requirements, and the project will continue to innovate and work with the V8 team to ensure that Node.js continues to evolve. - -“Under the Node.js Foundation, our unified community has made incredible progress in developing a converged codebase,” said Mikeal Rogers, Community Manager of The Node.js Foundation. “We believe that the new release and LTS cycles allow the project to continue its innovation and adopt cutting-edge JavaScript features, while also serving the need for predictable long-term stability and security demanded by a growing number of enterprise users who are proudly adopting Node.js as a key technology.” - -Additional updates include: - -* **Stability and Security**: Key Node.js Foundation members, such as IBM, NodeSource and StrongLoop, contributed a strong enterprise focus to the latest release. Their contributions make this latest version more stable and secure for enterprise needs. -* **Improved Platform Test Coverage**: With the assistance of some major partners, including RackSpace, DigitalOcean, Scaleway and ARM Holdings, the new release has built one of the most advanced testing clusters of any major open source project, bringing additional stability to the platform. -* **First-Class Coverage of ARM variants**: All major ARM variants, ARMv6, ARMv7, and the brand new 64-bit ARMv8, which is making major inroads in the server market, are supported as part of the test infrastructure. Developers who need to use these architectures for developing enterprise-ready and IoT applications are assured a solid runtime. -* **Addition of Arrow Functions**: Node.js v4 now includes arrow functions, an addition that was not previously available even in io.js. - -The technical steering committee for the Node.js Foundation is now 15 members strong, with 40-plus core committers and 350+ GitHub organization members contributing to the community. The development process and release cycles are much faster due to the large, active community united under the Node.js Foundation umbrella. The next release is planned before the end of 2015. In parallel, the project will be branching a new stable line of releases every six months, with one planned in October and another for spring of 2016. - -Additional Resources -* Technical Blog - [Node v4.0.0 (Stable)](https://nodejs.org/en/blog/release/v4.0.0/) -* New GitHub [home](https://github.com/nodejs/node) - -About Node.js Foundation -Node.js Foundation is a collaborative open source project dedicated to building and supporting the Node.js platform and other related modules.
Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 2 million downloads per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft and PayPal. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, npm, Sauce Labs, SAP, StrongLoop and YLD!. Get involved here: [https://nodejs.org](https://nodejs.org). -The Node.js Foundation is a Collaborative Project at The Linux Foundation. Linux Foundation Collaborative Projects are independently funded software projects that harness the power of collaborative development to fuel innovation across industries and ecosystems. [https://foundation.nodejs.org/](https://foundation.nodejs.org/) - -> Node.js Foundation is a licensed mark of Node.js Foundation. Node.js is a trademark of Joyent, Inc. and is used with its permission - -Media Contact -Node.js Foundation -Sarah Conway -978-578-5300 -pr@nodejs.org diff --git a/locale/fa/blog/announcements/index.md b/locale/fa/blog/announcements/index.md deleted file mode 100644 index 85cc53467b51..000000000000 --- a/locale/fa/blog/announcements/index.md +++ /dev/null @@ -1,6 +0,0 @@ ---- -title: Announcements -layout: category-index.hbs -listing: true -robots: noindex, follow ---- diff --git a/locale/fa/blog/announcements/interactive-2015-keynotes.md b/locale/fa/blog/announcements/interactive-2015-keynotes.md deleted file mode 100644 index ecc871e892fb..000000000000 --- a/locale/fa/blog/announcements/interactive-2015-keynotes.md +++ /dev/null @@ -1,62 +0,0 @@ ---- -title: Keynotes for Node.js Interactive 2015 Announced -date: 2015-11-20T09:00:00.000Z -status: publish -category: Annoucements -slug: interactive-2015-programming -layout: blog-post.hbs ---- - -> Keynotes from GoDaddy, IBM, NodeSource, Uber and More Featured At Inaugural Node.js Foundation Event December 8-9, 2015, in Portland, Ore. - -**SAN FRANCISCO, Nov. 20, 2015** – The [Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced the final keynotes and programming for [Node.js Interactive](http://events.linuxfoundation.org/events/node-interactive). The event will feature conversations and presentations on everything from the future of Node.js in IoT to collaborations between the community and the enterprise. - -Node.js is the runtime of choice for building mobile, web and cloud applications. The diversity of the technology and its capabilities are making it ubiquitous in almost every ecosystem from personal finance to robotics. To highlight changes with the platform and what’s to come, Node.js Interactive will focus on three tracks: Frontend, Backend and the Internet of Things (IoT). Highlights of these tracks available [here](https://nodejs.org/en/blog/announcements/interactive-2015-programming/); full track sessions [here](http://events.linuxfoundation.org/events/node-interactive/program/schedule). - -Node.js Interactive brings together a broad range of speakers to help experienced and novice Node.js users alike learn tips, best practices, new skills, as well as gain insight into future developments for the technology. 
- -2015 Node.js Interactive keynotes include: - -### Day 1, December 8, 2015 - -* Jason Gartner, Vice President, WebSphere Foundation and PureApplication Dev at IBM -* James Snell, Open Technologies at IBM, “Convergence: Evolving Node.js with Open Governance and an Open Community” -* Joe McCann, Co-Founder and CEO at NodeSource, “Enterprise Adoptions Rates and How They Benefit the Community” -* Ashley Williams, Developer Community and Content Manager at npm -* Tom Croucher, Engineer Manager at Uber, “Node.js at Uber” - -### Day 2, December 9, 2015 - -* Mikeal Rogers, Node.js Foundation Community Manager at The Linux Foundation, “Node.js Foundation Growth and Goals” -* Danese Cooper, Distinguished Member of Technical Staff - Open Source at PayPal and Node.js Foundation Chairperson -* Panel Discussion with Node.js Foundation [Technical Steering Committee members](https://foundation.nodejs.org/tsc/) - -In addition to keynotes, Node.js Foundation will have breakout sessions and panels discussing how Node.js is used in some of the largest and fastest growing organizations. - -**These include:** - -* Robert Schultz, Applications Architect at Ancestry -* Azat Mardan, Technology Fellow at Capital One -* Charlie Robbins, Director of Engineering UX Platform at GoDaddy -* Chris Saint-Amant, Director of UI Engineering at Netflix -* Kim Trott, Director of UI Platform Engineering at Netflix -* Bill Scott, VP of Next Generation Commerce at PayPal -* Panel - APIs in Node.js with GoDaddy, Symantec, and StrongLoop Inc. -* Panel - Node.js and Docker with Joyent, Ancestry and nearForm -* Panel - Node.js in the Media with Condé Nast, Mic and Bloomberg - -“Our list of speakers and breakout sessions are a great way to dive head first into Node.js, no matter if you are new to the platform or an expert,” said Mikeal Rogers, Community Manager, Node.js Foundation. “It is a great way for the community to come together, learn, share and better understand where the technology is heading in the future. The case studies, keynotes and breakout sessions showcased at the event shows how rapidly Node.js is growing in the enterprise.” - -Standard registration closes November 27, 2015, after which the conference price will increase from $425 to $525. To register visit [https://www.regonline.com/Register/Checkin.aspx?EventID=1753707](https://www.regonline.com/Register/Checkin.aspx?EventID=1753707). - -Node.js Interactive is made possible by Platinum sponsor IBM; Gold sponsors: Joyent, Microsoft, Modulus Inc., Red Hat; and Silver sponsors NodeSource, nearForm and npm and Synchro. - -### Additional Resources - -* Learn more about the [Node.js Foundation](https://foundation.nodejs.org/), and get involved with the [project](https://nodejs.org/en/get-involved/). -* Want to keep abreast of Node.js Foundation news? Sign up for our newsletter at the bottom of the [Node.js Foundation page](https://foundation.nodejs.org/). -* Follow on [Twitter](https://twitter.com/nodejs?ref_src=twsrc^google|twcamp^serp|twgr^author) and [Google+](https://plus.google.com/u/1/100598160817214911030/posts). - -About Node.js Foundation -Node.js Foundation is a collaborative open source project dedicated to building and supporting the Node.js platform and other related modules. Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 2 million downloads per month. 
It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, npm, Sauce Labs, SAP, and YLD!. Get involved here: [https://nodejs.org](https://nodejs.org). -The Node.js Foundation is a Collaborative Project at The Linux Foundation. Linux Foundation Collaborative Projects are independently funded software projects that harness the power of collaborative development to fuel innovation across industries and ecosystems. [https://foundation.nodejs.org/](https://foundation.nodejs.org/) diff --git a/locale/fa/blog/announcements/interactive-2015-programming.md b/locale/fa/blog/announcements/interactive-2015-programming.md deleted file mode 100644 index 53f4af814067..000000000000 --- a/locale/fa/blog/announcements/interactive-2015-programming.md +++ /dev/null @@ -1,60 +0,0 @@ ---- -title: Node.js Foundation Announces Programming For Node.js Interactive -date: 2015-10-20T17:00:00.000Z -status: publish -category: Annoucements -slug: interactive-2015-programming -layout: blog-post.hbs ---- -

> Inaugural Conference to Advance the Use of Node.js Within Backend, Frontend, IoT Applications

SAN FRANCISCO, Oct. 20, 2015 – [The Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced initial programming for [Node.js Interactive](http://events.linuxfoundation.org/events/node-interactive). This inaugural event, which is being led by the newly formed Node.js Foundation in cooperation with the Linux Foundation, will be held December 8-9, 2015, in Portland, Ore.

Node.js has become ubiquitous in almost every ecosystem in technology and is increasingly being used in mainstream enterprises. To continue to evolve the platform, Node.js Interactive brings together a wide range of community, projects, products and companies to create an educational and collaborative space. With more than 700 attendees expected, Node.js Interactive will provide a way to network with other developers and engineers within this diverse community.

Node.js Interactive will also focus on three tracks: Frontend, Backend and the Internet of Things (IoT); talks for each track were selected in collaboration with track chairs [Jessica Lord](https://github.com/jlord/) (Frontend), [C J Silverio](https://github.com/ceejbot) (Backend) and [Kassandra Perch](https://github.com/nodebotanist) (IoT). A few highlights include:

Frontend Session Highlights:
* JavaScript, For Science!
*with* Max Ogden, Computer Programmer for Dat Project -* Making Your Node.js Applications Debuggable *with* Patrick Mueller, Senior Node Engineer at NodeSource -* Node Intl: Where We Are, What's Next *with* Steven Loomis, Senior Software Engineer at IBM -* Rapid Development of Data Mining Applications in Node.js *with* Blaz Fortuna, Research Consultant for Bloomberg L.P., Senior Researcher at Jožef Stefan Institute and Partner at Quintelligence -* Real-Time Collaboration Sync Strategies *with* Todd Kennedy, CTO of Scripto -* Rebuilding the Ship as It Sails: Making Large Legacy Sites Responsive *with* Philip James, Senior Software Engineer at Eventbrite - -Backend Session Highlights: -* Building and Engaging High-Performance Teams in the Node.js Ecosystem *with* Chanda Dharap, Director of Engineering at StrongLoop, an IBM company -* Microservice Developer Experience *with* Peter Elger, Director of Engineering at nearForm -* Modernizing Winston for Node.js v4 *with* Charlie Robbins, Director of Engineering UX Platform at GoDaddy -* Node.js API Pitfalls, Can You Spot Them? *with* Sam Roberts, Node/Ops Developer at StrongLoop, an IBM Company -* Node.js Performance Optimization Case Study *with* Bryce Baril, Senior Node Engineer at NodeSource -* Resource Management in Node.js *with* Bradley Meck, Software Engineer at NodeSource - -IoT Session Highlights: -* Contributing to Node Core *with* Jeremiah Senkpiel, Node Core Contributor at NodeSource -* Hands on Hardware Workshop *with* Tessel with Kelsey Breseman, Engineering Project Manager at 3D Robotics and Steering Committee Member and Board Co-Creator of Tessel Project -* Internet of Cats *with* Rachel White, Front-End Engineer for IBM Watson -* IoT && Node.js && You *with* Emily Rose, Senior Software Engineer at Particle IO -* Node.s Bots at Scale *with* Matteo Collina, Architect at nearForm -* Node.js Development for the Next Generation of IoT *with* Melissa Evers-Hood, Software Product Line Manager at Intel Corporation -* Node.js While Crafting: Make Textile to Compute! *with* Mariko Kosaka, JavaScript Engineer at Scripto - -“Node.js has become pervasive within the last few years, with so many community accomplishments to highlight, including forming the new Node.js Foundation and the convergence of io.js and node.js,” said Mikeal Rogers, Community Manager, Node.js Foundation. “We created this conference to help showcase this growth, to accommodate the Node.js community’s many different needs, and to help accelerate adoption as it expands into enterprises.” - -Early bird registration ends October 23, 2015. Standard registration closes November 21, 2015, after which the conference price will increase from $425 to $525. Discounted hotel rates are also available until Wednesday, November 11, 2015. To register visit [https://www.regonline.com/Register/Checkin.aspx?EventID=1753707](https://www.regonline.com/Register/Checkin.aspx?EventID=1753707). - -Node.js Interactive is made possible by platinum sponsor IBM, gold sponsor Microsoft, and silver sponsors NodeSource and nearForm. - -Additional panels and keynotes will be announced in the coming weeks; to see the initial program visit: [http://nodejspdx2015.sched.org](http://nodejspdx2015.sched.org). For more information visit [http://events.linuxfoundation.org/events/node-interactive](http://events.linuxfoundation.org/events/node-interactive). 
- -Additional Resources - -Learn more about the [Node.js Foundation](https://foundation.nodejs.org/), and get involved with [the project](https://nodejs.org/en/get-involved/). -Want to keep abreast of Node.js Foundation news? Sign up for our newsletter at the bottom of the [Node.js Foundation page](https://foundation.nodejs.org/). -Follow on [Twitter](https://twitter.com/nodejs?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor) and [Google+](https://plus.google.com/u/1/100598160817214911030/posts). - -About Node.js Foundation
Node.js Foundation is a collaborative open source project dedicated to building and supporting the Node.js platform and other related modules. Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 2 million downloads per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, npm, Rising Stack, Sauce Labs, SAP, and YLD!. Get involved here: [https://nodejs.org](https://nodejs.org). -The Node.js Foundation is a Collaborative Project at The Linux Foundation. Linux Foundation Collaborative Projects are independently funded software projects that harness the power of collaborative development to fuel innovation across industries and ecosystems. [https://foundation.nodejs.org/](https://foundation.nodejs.org/) diff --git a/locale/fa/blog/announcements/interactive-2015.md b/locale/fa/blog/announcements/interactive-2015.md deleted file mode 100644 index f474a337e567..000000000000 --- a/locale/fa/blog/announcements/interactive-2015.md +++ /dev/null @@ -1,26 +0,0 @@ ---- -title: Node.js Interactive -date: 2015-09-10T17:00:00.000Z -status: publish -category: Annoucements -slug: interactive-2015 -layout: blog-post.hbs ---- -Are You Ready for Node.js Interactive? - -The Node.js Foundation is pleased to announce [Node.js Interactive](http://interactive.nodejs.org) happening from December 8-9, 2015 in Portland, OR. With node.js growing in all aspects of technology, the gathering will cover everything from streamlining development of fast websites and real-time applications to tips for managing node.js applications, and much more. - -The event will be the first of its kind under the Node.js Foundation led in cooperation with The Linux Foundation. Vendor-neutral by design, it will focus on the continued ideals of open governance collaboration between the now joined node.js and io.js community. The conference welcomes experienced developers as well as those interested in how node.js might be of use to their business with tracks that focus on IoT, front-end and back-end technologies. To curate these tracks and create the best experience for attendees, track chairs include seasoned veterans: - -* [Kassandra Perch](https://github.com/nodebotanist) for IoT, a software developer / evangelist / advocate / educator / roboticist living in Austin, TX, who you can follow at: [@nodebotanist](https://twitter.com/nodebotanist). -* [Jessica Lord](https://github.com/jlord/) for Front-End, a GitHub developer and designer who loves open source, JavaScript & node.js, and stories of Tudor England and is a Portland transplant. -* [C J Silverio](https://github.com/ceejbot) for Back-End, who is all node, all the time and works as VP of engineering at npm, Inc. in the Bay area. - -As the Node.js community continues to grow, the Node.js Foundation believes this event is the perfect place to continue to develop collaboration and better understand what’s next for this extremely popular technology. Interested in joining us? Register [here](http://events.linuxfoundation.org/events/node-interactive/attend/register). 
Timeline for discount rates are as follows: - -* Super Early Bird - US$200 for the 1st 100 tickets -* Early Bird - US$325, ends October 17 -* Standard - US$425, ends November 21 -* Late & Onsite - US$525, begins November 22 - -If you are interested in becoming a speaker, please check out our [Call For Participation](http://events.linuxfoundation.org/events/node-interactive/program/cfp) page for more details. Call for Participation closes on September 24, 2015. diff --git a/locale/fa/blog/announcements/interactive-2016-ams.md b/locale/fa/blog/announcements/interactive-2016-ams.md deleted file mode 100644 index 0322325de07d..000000000000 --- a/locale/fa/blog/announcements/interactive-2016-ams.md +++ /dev/null @@ -1,84 +0,0 @@ ---- -title: Node.js Foundation Announces Keynotes and Programming for Node.js Interactive Europe -date: 2016-06-29T12:00:00.000Z -status: publish -category: Annoucements -slug: interactive-2016-ams-programming -layout: blog-post.hbs ---- - -> Event provides neutral forum for learning about the future of Node.js and JavaScript from the community and enterprise alike - -**SAN FRANCISCO, June 29, 2016** – [Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced the initial programming for Node.js Interactive Europe, September 15 -16, 2016, in Amsterdam, Netherlands. The event will showcase workshops, community and technical talks, and use cases that will inform the future development of Node.js and JavaScript. - -With 4 million users a month and adoption across numerous industries, Node.js is emerging as a universal platform used for web applications, IoT, enterprise application development and microservice architectures. This marquee event attracts enterprise users, developers and community stakeholders, providing them with a unique opportunity for cross-disciplinary discussions that are aimed to provide new insights and new opportunities around Node.js development. - -“We’ve hand-selected a range of presenters and content that will showcase the future of Node.js and how pervasive it has become in the market through both a community and enterprise lens,” said Mikeal Rogers, community manager of Node.js Foundation. “This is a perfect conference if you are a front end, back end, mobile, IoT or full stack developer.” - -The keynotes will focus on the future of Node.js and corresponding technologies. The initial keynotes include: - -* **Ashley Williams**, Node.js Foundation community board chair, founder of NodeTogether, and developer community and content manager at npm -* **Doug Wilson**, Express lead maintainer -* **James Snell**, IBM engineer and Node.js Foundation TSC member -* **Kat Marchán**, CLI engineer at npm -* **Mikeal Rogers**, community manager at the Node.js Foundation - -Experts from the world’s leading companies and most important open source projects will deep dive into tracks ranging from artificial intelligence to security. 
A sampling of this year’s sessions include: - -## Cloud and Back End - -* Node.js and Containers go together like Peanut Butter and Jelly from **Ross Kukulinski of NodeSource** -* Building the Node.JS Global Distribution Network from **Guillermo Rauch creator of Socket.io** -* SWIMming in the microservices Ocean from **Luca Maraschi of Sporti and nearForm** - -## Diagnosing, Debugging, and DevOps - -* Instrumentation and Tracing in Node.js from **Thomas Watson of Opbeat** -* The Cost of Logging from **Matteo Collina of nearForm** - -## Machine Learning, Big Data, Artificial Intelligence - -* Taking on Genetically Evolving Cellular Automata with JavaScript from **Irina Shestak of Small Media Foundation** -* From Pterodactyls and Cactus to Artificial Intelligence by **Ivan Seidel of Tenda Digital** - -## Node.js Core - -* Keeping the Node.js Community Infrastructure Humming: An Update from the Build Workgroup from **Michael Dawson of IBM** -* Creating Native Addons - General Principles from **Gabriel Schulhof of Intel** -* The CITGM Diaries from **Myles Borins of IBM** - -## Security - -* FIPS Comes to Node.js from **Stefan Budeanu of IBM** -* Take Data Validation Seriously from **Paul Milham of WildWorks** - -## IoT - -* Node.js on Hardware: Where We Are, Where We're Going, and How We'll Get There from **Kassandra Perch of NodeBots** -* Why did the robot cross the road? Computer vision, robots and mobile games from **Pawel Szymczykowski of Wedgies** -* The Future is Now: How to Realize your New Potential as a Cyborg from **Emily Rose of Salesforce** - -## Node.js Everywhere - -* Bitcoin, Blockchain and Node from **Portia Burton of The Atlantic** -* Node.js and the African Market from **Ogatcha Prudence of Pilby** -* The Radical Modularity from **Aria Stewart of npm** - -## Workshops - -* Deploying Node.js applications using plain old Linux from **Luke Bond of YLD** -* Build a real-time multiplayer chess game with Socket.io from **David Washington of Microsoft** -* Isomorphic JavaScript with React + Express from **Azat Mardan of Capital One** - -The event will provide free onsite childcare for attendees and offers ASL, interpretation and transcription assistance upon request. The Node.js Foundation is offering three diversity scholarships this year. More information can be found [here](http://events.linuxfoundation.org/events/node-interactive-europe/attend/diversity-scholarship). - -Node.js Interactive is made possible by support from Platinum Sponsor IBM; Gold Sponsors nearForm and YLD; and Silver Sponsor Opbeat. If you are interested in sponsoring please contact Todd Benzies at tbenzies@linuxfoundation.org. - -For attendees who register before July 4th, the early bird registration fee is $400. Visit [here](https://www.regonline.com/Register/Checkin.aspx?EventID=1811779) to register. - -About the Node.js Foundation - -Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 4 million active users per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. 
Gold members include GoDaddy and NodeSource, and Silver members include Apigee, AppDynamics, Codefresh, DigitalOcean, Fidelity, Google, Groupon, nearForm, New Relic, npm, Opbeat, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), Sphinx, YLD, and Yahoo!. Get involved here: [https://nodejs.org](https://nodejs.org). - diff --git a/locale/fa/blog/announcements/interactive-2016-north-america-schedule.md b/locale/fa/blog/announcements/interactive-2016-north-america-schedule.md deleted file mode 100644 index 25aea3678f72..000000000000 --- a/locale/fa/blog/announcements/interactive-2016-north-america-schedule.md +++ /dev/null @@ -1,103 +0,0 @@ ---- -title: Node.js Foundation Announces Schedule for Second Annual Node.js Interactive North America -date: 2016-09-12T16:00:00.000Z -status: publish -category: Annoucements -slug: interactive-2016-north-america-schedule -layout: blog-post.hbs ---- - -> IBM, Netflix, Microsoft, and leading community experts to showcase the current and future of Node.js - -**SAN FRANCISCO, September 12, 2016** — [The Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced the keynotes and agenda for Node.js Interactive North America, November 29 - December 2, 2016, in Austin, TX. The event attracts enterprise users, developers, and community stakeholders, providing them with the tools and training they need to optimize the Node.js platform. - -With almost 5 million users a month and adoption across numerous industries, Node.js is a universal platform for web applications, IoT, enterprise application development, and microservice architectures. Its liberal contribution policies have also allowed the platform to increase the number of contributors working on the project by a sustained 100% year-over-year growth for the last several years. - -Node.js Interactive offers a unique mix of skill-building and knowledge-sharing sessions, panels, and workshops to help developers accelerate their use of Node.js. It is the only vendor-neutral event that offers the community better insight into Node.js and its working groups, combined with best practices and instruction on how to improve performance, debugging, security, and tooling for mainstream enterprise users. - -The event is designed to appeal to both experienced and new developers and architects with a NodeSchool event happening during the conferences. Additionally, NodeTogether, a beginner tutorial event that also aims to improve diversity within the Node.js community, is holding a special teachers training at the event. - -“Node.js has clearly become a high-priority platform for digital transformation. Node.js Interactive will have a roster of expert technical and community speakers who will discuss the future of Node.js and new growth in areas like artificial intelligence, cloud native architectures, container-packaged applications and more,” said Mikeal Rogers, community manager of the Node.js Foundation. - -Keynotes for the conference will provide an update and future look at Node.js and its growing ecosystem of related modules. 
A sampling of keynotes includes: - -- “The Road Forward on Education and Diversity” - Tracy Hinds of the Node.js Foundation and Emily Rose of Salesforce -- “npm State of the Union” - Ashley Williams of npm -- “Node.js State of the Union” - Rod Vagg of NodeSource and Technical Steering Committee Director of Node.js Foundation -- “Express State of the Union” - Doug Wilson, Express lead maintainer - -Experts from the leading open source projects and enterprises will share their expertise with Node.js and JavaScript in areas ranging from artificial intelligence to full stack development. Highlights include: - -### Node.js Everywhere - -- “Node.js Releases, How Do They Work?” - Myles Borins of IBM -- “Slaying Monoliths with Docker and Node.js” - Yunong Xiao of Netflix -- “Instrumentation and Tracing in Node.js” - Thomas Watson of Opbeat -- “Surviving Web Security Using Node.js” - Gergely Nemeth of RisingStack -- “Writing Secure Node Code: Understanding and Avoiding the Most Common Node.js Security Mistakes” - Guy Podjarny of Snyk - -### Cloud and Back End - -- “Hitchhiker’s Guide to ‘Serverless’ JavaScript” - Steven Faulkner of Bustle -- “Making Magic in the Cloud with Node.js at Google” - Justin Beckwith of Google -- “Buzzword Bingo: Architecting a Cloud-Native Internet Time Machine” - Ross Kukulinski of NodeSource - -### Diagnosing, Debugging, and DevOps - -- “Building and Shipping Node.js Apps with Docker” - Mano Marks of Docker -- “The Morality of Code” - Glen Goodwin of SAS Institute, Inc. - -### Machine Learning, Big Data, Artificial Intelligence - -- “Real-Time Machine Learning with Node.js” - Phillip Burckhardt of Carnegie Mellon University -- “Math in V8 is Broken and How We Can Fix It” - Athan Reines of Fourier - -### Node.js Core - -- “Contributing to Node.js: Coding Not Required” - William Kapke of Kap Co, LLC -- “A Beginner’s Guide To Reading Node.js Core Source” - Rich Trott of University of California, San Francisco -- “Node.js and ChakraCore” - Arunesh Chandra of Microsoft -- “Implementing HTTP/2 for Node.js Core” - James Snell of IBM - -### The New Full Stack - -- “Serverless Front-End Deployments using npm” - Charlie Robbins of GoDaddy -- “API Design Through the Lens of Photography” - Bryan Hughes of Microsoft -- “JavaScript will Let Your Site Work without JavaScript” - Sarah Meyer of Buzzfeed -- “Nodifying the Enterprise” - Shweta Sharma from To The New -- “Full Stack Testing with Node.js” - Stacy Kirk of Quality Works - -### IoT - -- “IoT & Developer Happiness” - Emily Rose of Salesforce -- “Taking on Genetically Evolving Cellular Automata with JavaScript” - Irina Shestak of Small Media Foundation - -### Operations and Performance - -- “Scaling State” - Matteo Collina of nearForm -- “Don't Let Just Node.js Take the Blame!” - Daniel Khan of Dynatrace - -### Workshops - -- “Games as Conversational Interfaces” - Kevin Zurawel of Braintree -- “Agile Security for Web Developers” - Kim Carter of BinaryMist -- “Science Meets Industry: Online Behavioral Experiments with nodeGame” - Stefano Balietti of Northeastern University -- “Building Desktop Applications With Node.js Using Electron” - Steve Kinney of Turing School of Software and Design - -Free onsite childcare for attendees is available as well as ASL, interpretation and transcription assistance upon request. Please email events@node.js for more information. 
- -Node.js Interactive is made possible by support from Platinum Sponsors IBM and Google Cloud Platform; Gold Sponsors nearForm and NodeSource; Silver Sponsors GoDaddy, Langa, Opbeat, Rollbar, and Sauce Labs; and Bronze Sponsors Codiscope, Sqreen, and Stormpath. If you are interested in sponsoring please contact Todd Benzies at tbenzies@linuxfoundation.org. - -For attendees who register before November 14, the standard registration fee is $600; registration increases to $800 after November 14. Visit [here](http://events.linuxfoundation.org/events/node-interactive) to register. - -### About the Node.js Foundation - -Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 4.5 million active users per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks, and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members IBM, Intel, Joyent, Microsoft, PayPal, and Red Hat. Gold members include GoDaddy and NodeSource, and Silver members include Apigee, AppDynamics, Cars.com, Codefresh, DigitalOcean, Dynatrace, Fidelity, Google, Groupon, nearForm, New Relic, npm, Opbeat, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), Sphinx, YLD, and Yahoo!. Get involved here: https://nodejs.org. - -### Media Contact - -Zibby Keaton -Node.js Foundation -zkeaton@linuxfoundation.org diff --git a/locale/fa/blog/announcements/nodejs-certified-developer-program.md b/locale/fa/blog/announcements/nodejs-certified-developer-program.md deleted file mode 100644 index 915929f86b87..000000000000 --- a/locale/fa/blog/announcements/nodejs-certified-developer-program.md +++ /dev/null @@ -1,57 +0,0 @@ ---- -title: The Node.js Foundation Partners with The Linux Foundation on New Node.js Certification Program -date: 2017-1-26T12:00:00.000Z -category: Annoucements -slug: nodejs-certified-developer-program.md -layout: blog-post.hbs ---- - -_Node.js Foundation to launch -vendor-neutral certification program for fastest growing platform _ - - **SAN FRANCISCO, Jan. 26, 2017** — [The Node.js Foundation](https://foundation.nodejs.org/), a community-led and -industry-backed consortium to advance the development of the Node.js platform, -today announced development of the inaugural Node.js certification program -aimed to establish a baseline competency in Node.js. - -Node.js is one of the most popular programming languages with more than 4.5 million -active users per month. While Node.js is increasingly pervasive with -enterprises of all sizes today, organizations that are eager to expand their -use of Node.js often struggle when retraining Java developers and recruiting -new talent. - -“The Node.js Foundation, with help from incredible community members and core -experts, is creating a comprehensive certification program that broadens the -funnel of skilled Node.js expertise available. Whether working in enterprise -environments or as individual consultants, those who become Node.js Certified -Developers will be well-positioned to hit the ground running as a Node.js -developer, possessing skills that are in high demand,” said Tracy Hinds, -education community manager for the Node.js Foundation. - -The Node.js Certified Developer program, which is being developed with input from -leading Node.js experts and contributors, is expected to be available in Q2 of 2017. 
The program will provide a framework for general Node.js competency, helping enterprises quickly identify qualified Node.js engineers, while providing developers, contractors and consultants with a way to differentiate themselves in the market.

The Node.js Foundation worked closely with [The Linux Foundation](https://training.linuxfoundation.org/certification/why-certify-with-us) to create the blueprint and process for administering the program. The Linux Foundation offers a neutral home for running training and certification programs, thanks to its close involvement with the open source community. It offers several open online courses (MOOCs), including an [Intro to Linux](https://www.edx.org/course/introduction-linux-linuxfoundationx-lfs101x-0), [Intro to DevOps: Transforming and Improving Operations](https://www.edx.org/course/introduction-devops-transforming-linuxfoundationx-lfs161x); [Developing Applications for Linux](https://training.linuxfoundation.org/linux-courses/development-training/developing-applications-for-linux); [Kubernetes Fundamentals](https://training.linuxfoundation.org/linux-courses/system-administration-training/kubernetes-fundamentals); among many others.

Ideal Node.js Certified Developer candidates are early intermediate-level developers who can already work proficiently in JavaScript with the Node.js platform. Pricing for the self-paced, online exam is still to be determined.

Currently the Node.js Foundation is working with the community to determine specific questions that will be asked on the exam. To contribute to the Node.js Foundation Certification Development Item Writing Workshop Sessions, fill out this [application](https://docs.google.com/a/linuxfoundation.org/forms/d/10X9RJ4oLu2IU7cXppnXmwDMdJTetq3i9focw-R7GB8s/viewform?edit_requested=true).

Exam topics will be published publicly, as will resources to help prepare for the certification, allowing others to leverage the source materials for their own Node.js learning. diff --git a/locale/fa/blog/announcements/nodejs-foundation-momentum-release.md b/locale/fa/blog/announcements/nodejs-foundation-momentum-release.md deleted file mode 100644 index 6ae0d104abf5..000000000000 --- a/locale/fa/blog/announcements/nodejs-foundation-momentum-release.md +++ /dev/null @@ -1,42 +0,0 @@ ---- -title: The Node.js Platform and Node.js Foundation Continue to Grow -date: 2016-11-30T12:00:00.000Z -category: Annoucements -slug: node.js-foundation-momentum-release -layout: blog-post.hbs ---- -

> With 5.7 million users, increased community participation and a solid Foundation backing: 2016 was a good year for the platform.

**NODE.JS INTERACTIVE 2016, AUSTIN, TX., Nov. 30, 2016** — [The Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced [Snyk](https://snyk.io) as a Silver Member, major community and code growth, and the opening of an expansive Node.js user survey.

Founded in 2015, the Node.js Foundation was created to accelerate the development of Node.js and support the large ecosystem that encompasses it through open governance. Membership has grown 30 percent since the Foundation’s inception and represents a mix of Fortune 500 companies and startups alike. The newest member, Snyk, is a security company that finds, fixes and monitors known vulnerabilities in Node.js and Ruby on Rails applications.
- -“Snyk is focused on securing the vast module ecosystem that makes Node.js one of the most powerful runtime platforms today,” said Guy Podjarny, CEO and co-founder of Snyk. “We are excited to join the Node.js Foundation as our efforts align with the organization's focus to support and grow Node.js and its module ecosystem.” - -With more than 15 million downloads per month and more than a billion package downloads per week, Node.js is considered the biggest open source platform powering everything from web, IoT and desktop applications to microservice architectures. In 2016, the Node.js Project issued 63 new releases with seven different release managers. Node.js version 4 was the most popular release with release downloads increasingly 220% year over year. - -From a community growth standpoint, there were more than twice the number of new contributors than in 2015 and 1.5 times the number of unique contributors to the codebase per month compared to 2015. - -“The Node.js Project is focused on a new type of open source contribution philosophy, participatory governance, which liberalizes contribution policies and provides more direct ownership to contributors,” said Mikeal Rogers, community manager of the Node.js Foundation. “Through this approach, we’ve seen an explosion in contributor growth, which is critical to sustaining such an important open source project.” - -The second Node.js Interactive North America is in full swing with more than 700 developers, DevOps professionals, IoT engineers, engineering managers, and more in Austin. Node.js Interactive brings together a broad range of speakers to help experienced and new technologists better understand the Node.js platform and get insights into the future development of the project. - -Attendees are also getting a first look at Node.js advancements announced and demoed this week including: - -**The Node.js Foundation** announced progress with efforts to make Node.js VM-neutral - more information on this news can be found on the Node.js Foundation [Medium blog](https://medium.com/@nodejs/ibm-intel-microsoft-mozilla-and-nodesource-join-forces-on-node-js-48e21ffb697d#.jylk1mc0l). This morning, the Foundation announced it would oversee the Node.js Security Project to further improve stability for enterprises. More information [here](http://www.marketwired.com/press-release/nodejs-foundation-to-oversee-nodejs-security-project-to-further-improve-stability-enterprises-2179602.htm).     - -**NodeSource** announced NodeSource Certified Modules™ to bring security and trust to untrusted, third-party JavaScript. With NodeSource Certified Modules, consumers of the npm ecosystem can now rely on NodeSource as a secure, trusted and verifiable source. Learn more [here](https://certified.nodesource.com/). The team is also demoing its latest [N|Solidv2.0](https://nodesource.com/products/nsolid). - -During the conference, the Node.js Foundation also launched **its second user survey**, which will remain open until the end of December. This survey builds on the questions asked in the first Node.js Foundation survey, which was conducted in January 2016, and adds a number of questions designed to shed even more light on who uses Node.js, where they are located, how they learned Node.js, what they use it for, what other technologies they use it with and more. - -As with the January survey, the Node.js Foundation will produce and make available publicly for free a report with the survey’s key findings. 
If you use Node.js, please take the survey and tell your friends and colleagues: [https://www.surveymonkey.com/r/Node16](https://www.surveymonkey.com/r/Node16) - -If you are interested in receiving the latest updates from the conference and what’s to come in 2017, be sure to follow [@nodejs](https://twitter.com/nodejs) on Twitter and subscribe to the [Node.js Foundation mailing list](http://go.linuxfoundation.org/l/6342/2015-09-15/2sgqpp). - -About Node.js Foundation -Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 4.5 million active users per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks, and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members GoDaddy, IBM, Intel, Joyent, Microsoft, NodeSource, PayPal, and Red Hat. Silver members include Apigee, AppDynamics, Cars.com, Codefresh, DigitalOcean, Dynatrace, Fidelity, Google, Groupon, nearForm, New Relic, npm, Opbeat, RisingStack, Sauce Labs, SAP, Snyk, StrongLoop (an IBM company), Sphinx, YLD, and Yahoo!. Get involved here: [https://nodejs.org](https://nodejs.org/). - -The Node.js Foundation is a Linux Foundation Project, which are independently funded software projects that harness the power of collaborative development to fuel innovation across industries and ecosystems. [www.linuxfoundation.org](http://www.linuxfoundation.org/) diff --git a/locale/fa/blog/announcements/nodejs-foundation-survey.md b/locale/fa/blog/announcements/nodejs-foundation-survey.md deleted file mode 100644 index f68a2f1a308a..000000000000 --- a/locale/fa/blog/announcements/nodejs-foundation-survey.md +++ /dev/null @@ -1,119 +0,0 @@ ---- -title: New Node.js Foundation Survey Reports New “Full Stack” In Demand Among Enterprise Developers -date: 2016-04-12T13:00:00.000Z -status: publish -category: Annoucements -slug: nodejs-foundation-survey -layout: blog-post.hbs ---- - -> Nearly 50 percent of Node.js developers surveyed using container technology, strong growth emerges in cloud, front end, mobile and devices - -**SAN FRANCISCO, April, 12, 2016** — [The Node.js Foundation](http://ctt.marketwire.com/?release=11G082331-001&id=8448115&type=0&url=https%3a%2f%2fnodejs.org%2fen%2ffoundation%2f), -a community-led and industry-backed consortium to advance the development of the Node.js -platform, today announced the availability of its first ever Node.js User Survey Report. - -With over 3.5 million users and an annual growth rate of 100 percent, Node.js is emerging as -a universal platform used for web applications, IoT, and enterprise. The Node.js User Survey -report features insights on emerging trends happening in this massive community that serves -as a leading indicator on trends like microservices architectures, real-time web applications, -Internet of Things (IoT). The report paints a detailed picture of the technologies that are -being used, in particular, with Node.js in production and language preferences (current and -future) for front end, back end and IoT developers. - -## Key findings from the Node.js Foundation survey - -### Node.js and Containers Take Off Together - -Both Node.js and containers are a good match for efficiently developing and deploying -microservices architectures. And, while the surge in container use is relatively new, **45 -percent of developers that responded to the survey use Node.js with the technology**. 
Other -container-related data points: - -* 58 percent of respondents that identified as IoT developers use Node.js with Docker. -* 39 percent of respondents that identified as back end developers use Node.js with Docker. -* 37 percent of respondents that identified as front end developers use Node.js with Docker. - -### Node.js — the Engine that Drives IoT - -JavaScript and Node.js have risen to be the language and platform of choice for IoT as both -are suited for data intensive environments that require parallel programming without -disruption. JavaScript, including Node.js and frameworks, such as React, have become the de -facto choice of developers working in these connected, device-driven environments with **96 -percent of IoT respondents indicating they use JavaScript/Node.js for development**. - -“Data about developer choices is catnip for developers,” said James Governor, RedMonk -co-founder. “In this survey, the Node.js Foundation identifies some interesting results, -notably about languages programmers are using alongside Node.js and IoT demographics.” - -These environments are challenging, and the survey revealed that on average, IoT developers -using Node.js have more experience than their front end and back end counterparts with more -than 40 percent of IoT developers surveyed having over 10+ years of development experience. - -Additionally, although Docker is a server technology, many IoT developers (58%) are using -Node.js with Docker compared to only 39 percent of back end developers. This metric is -significant as it means that the new IoT world also is quickly adopting containers and -microservices. - -### Node.js Becoming Universal Platform - -**The full stack is no longer “front end and back end,” but rather “front end, back end and -connected devices,”** which is a combination of everything from the browser to a toaster all -being run in JavaScript and enabled by Node.js. The survey revealed that 62 percent of -respondents are using Node.js for both front end and back end development, and nearly 10 -percent are using Node.js for front end, back end, and IoT development. - -### Node.js Pervasive in Enterprises - -Node.js is increasingly used in the enterprise, and used within huge enterprises like PayPal, -Go Daddy, Capital One, and Intel. The survey found: - -* **More than 45 percent already using the Node.js Long Term Support release (v4) geared -toward medium to large enterprise users who require stability and high performance.** -* Of those who haven’t upgraded, 80 percent report definite plans to upgrade to v4, with half -of respondents planning to do so this year. -* Strong interest in enterprise tooling among 34 percent of tech leaders. - -### Full “MEAN” Stack Explodes - -The popularity of real-time, social networking and interactive game applications is pushing a -new stack among developers. The MEAN stack is able to handle lots of concurrent connections -and extreme scalability, which these applications demand. Node.js, in combination with -MongoDB, Express, AngularJS, allows developers to tackle the needs of front end and back end -development. Not surprisingly, all of these technologies were commonly used alongside -Node.js. **Express, cited the most, is used by an average of 83 percent of developers**. - -### Popularity of JavaScript and Node.js - -JavaScript and Node.js were popular among back end, front end, and IoT developers. Other -languages, beyond JavaScript, that were popular for all three developer types included PHP, -Python and Java. 
However, when looking to the future, back end, front end and IoT developers planned to decrease their use of Java, .Net and PHP (PHP averages a 15% decrease) and increase the use of Python and C++.

## About the Survey

The survey was open for 15 days, from January 13 to January 28, 2016. During this time, 1,760 people from around the world completed the survey. Seventy percent were developers, 22 percent were technical management, and 64 percent run Node.js in production. Geographic representation of the survey covered: 35 percent from the United States, 22 percent from Continental Europe, 6 percent from India, and 6 percent from the United Kingdom, with the remaining respondents hailing from Asia, Latin America, Africa, Russia and the Middle East.

**Additional Resources:**
* [Node.js Foundation User survey infographic](/static/documents/2016-survey-infographic.png)
* [Report summarizing Node.js Foundation User Survey 2016](/static/documents/2016-survey-report.pdf)

**About Node.js Foundation**

Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 3 million active users per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites.

The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, AppDynamics, Codefresh, DigitalOcean, Fidelity, Google, Groupon, nearForm, New Relic, npm, Opbeat, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), Sphinx, YLD!, and Yahoo!. Get involved here: [https://nodejs.org](https://nodejs.org). diff --git a/locale/fa/blog/announcements/nodejs-security-project.md b/locale/fa/blog/announcements/nodejs-security-project.md deleted file mode 100644 index 3c7e379a7133..000000000000 --- a/locale/fa/blog/announcements/nodejs-security-project.md +++ /dev/null @@ -1,40 +0,0 @@ ---- -title: Node.js Foundation To Oversee Node.js Security Project To Further Improve Stability for Enterprises -date: 2016-11-30T12:00:00.000Z -category: Annoucements -slug: foundation-adds-node.js-security-project -layout: blog-post.hbs ---- -

> Node.js Security Project to become one of the largest community projects focused on detecting and fixing vulnerabilities for the fast-growing platform

**SAN FRANCISCO, Nov. 30, 2016** — [The Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced that the Node.js Security Project will become a part of the Node.js Foundation. Under the Node.js Foundation, the Node.js Security Project will provide a unified process for discovering and disclosing security vulnerabilities found in the Node.js module ecosystem.

Last year the Node.js Foundation worked with The Linux Foundation’s Core Infrastructure Initiative to form the Node.js Core Security Group to encourage security best practices. By overseeing datasets of vulnerability disclosures, which will be publicly available and openly licensed, the Foundation is building on this work and expanding its role in fortifying Node.js through strong security governance.
It will also allow the Foundation to drive standardization around security data and encourage a broader ecosystem of open source and vendor-based tools on top of it.

All security vendors are encouraged to contribute to the common vulnerability repository. Once it is openly licensed, the Foundation expects the repository to grow quickly as other vendors add to it.

With 15 million downloads per month, more than a billion package downloads per week, and growing adoption across numerous industries, Node.js and its module ecosystem underpin some of the most heavily used desktop, web, mobile, cloud and IoT applications in the world. A more open, robust, and standard process for finding and fixing vulnerabilities within the module ecosystem that surrounds Node.js is essential, according to Mikeal Rogers, community manager for the Node.js Foundation.

“The Node.js Security Project will become one of the largest projects to build a community around detecting and fixing vulnerabilities,” said Rogers. “Given the maturity of Node.js and how widely used it is in enterprise environments, it makes sense to tackle this endeavor under open governance facilitated by the Node.js Foundation. This allows for more collaboration and communication within the broad community of developers and end users, ensuring the stability and longevity of the large, continually growing Node.js ecosystem.”

A Node.js Security Project Working Group will be established in the next few weeks to begin validating vulnerability disclosures and maintaining the base dataset. Individuals and anyone from the Technical Steering Committee and Core Technical Committee are encouraged to join the working group and provide input on GitHub.

The Node.js Security Project, founded by Adam Baldwin and previously managed by [^Lift Security](https://liftsecurity.io/), an application security company, collects data around vulnerability and security flaws in the Node.js module ecosystem. The Node.js Foundation will take over the following responsibilities from ^Lift:

 * Maintaining an entry point for ecosystem vulnerability disclosure;
 * Maintaining a private communication channel for vulnerabilities to be vetted;
 * Vetting participants in the private security disclosure group;
 * Facilitating ongoing research and testing of security data;
 * Owning and publishing the base dataset of disclosures; and
 * Defining a standard for the data, which tool vendors can build on top of and security vendors can add data and value to as well.

“We are very excited about the opportunity to donate this project to the Node.js Foundation,” said Adam Baldwin, team lead at [^Lift Security](https://liftsecurity.io/) and founder of the Node.js Security Project. “The Foundation will be able to funnel contributions from numerous vendors, developers and end users to create an incredibly useful baseline of data sets that will be available to anyone. This ensures broader reach and long-lasting viability of the project to encourage availability of more security tools, which is increasingly in demand among Node.js enterprise developers and users.”

^Lift plans to provide upstream contributions to the project based on any new flaws their team uncovers through working with their customers.

### About the Node.js Foundation

Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 4.5 million active users per month.
It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks, and mobile websites. The Foundation is made up of a diverse group of companies including Platinum members GoDaddy, IBM, Intel, Joyent, Microsoft, NodeSource, PayPal, and Red Hat. Silver members include Apigee, AppDynamics, Cars.com, Codefresh, DigitalOcean, Dynatrace, Fidelity, Google, Groupon, nearForm, New Relic, npm, Opbeat, RisingStack, Sauce Labs, SAP, StrongLoop (an IBM company), Sphinx, YLD, and Yahoo!. Get involved here: [https://nodejs.org](https://nodejs.org/). diff --git a/locale/fa/blog/announcements/v6-release.md b/locale/fa/blog/announcements/v6-release.md deleted file mode 100644 index 806d000451b6..000000000000 --- a/locale/fa/blog/announcements/v6-release.md +++ /dev/null @@ -1,80 +0,0 @@ ---- -title: World’s Fastest Growing Open Source Platform Pushes Out New Release -date: 2016-04-26T12:00:00.000Z -status: publish -category: Annoucements -slug: v6-release -layout: blog-post.hbs ---- -

> New “Current” version line focuses on performance improvements, increased reliability and better security for its 3.5 million users

SAN FRANCISCO, April 26, 2016 — [The Node.js Foundation](http://ctt.marketwire.com/?release=11G082331-001&id=8448115&type=0&url=https%3a%2f%2fnodejs.org%2fen%2ffoundation%2f), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced the release of Node.js version 6 (Node.js v6). This release provides major performance improvements, increased reliability and better security.

With over 3.5 million users and an annual growth rate of 100 percent, Node.js is emerging as a universal platform used for web applications, IoT, mobile, enterprise application development, and microservice architectures. The technology is ubiquitous across numerous industries, from startups to Fortune 500 companies, and is the only unified platform that full stack JavaScript developers can use for front end, back end, mobile and IoT projects.

Performance improvements are key in this latest release, with one of the most significant improvements coming from module loading, which is currently four times faster than Node.js version 4 (Node.js v4). This will help developers dramatically decrease the startup time of large applications, improving productivity in development cycles and providing a more seamless experience for end users. In addition, Node.js v6 comes equipped with V8 JavaScript engine 5.0, which has improved ECMAScript 2015 (ES6) support. Ninety-three percent of [ES6](http://node.green/) features are also now supported in the Node.js v6 release, up from 56 percent for Node.js v5 and 50 percent for Node.js v4. Key features from ES6 include default and rest parameters, destructuring, and the class and super keywords.

Security is top-of-mind for enterprises and startups alike, and Node.js v6 has added several features that impact security, making it easier to write secure code. The new Buffer API will reduce the risk of bugs and vulnerabilities leaking into applications through a new constructor method used to create Buffer instances, as well as a zero-fill-buffers command-line flag. Using the new command-line flag, developers can continue to safely use older modules that have not been updated to use the new constructor API. In addition, V8 has improved its implementation of Math.random() to be more secure — this feature is added into Node.js v6.
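For illustration, here is a minimal sketch of the safer allocation pattern described above. It is not taken from the release notes; it simply assumes a Node.js runtime (such as v6) where `Buffer.alloc`, `Buffer.allocUnsafe`, and `Buffer.from` are available:

```js
// Sketch: the newer Buffer creation methods, which avoid handing the
// application uninitialized memory the way `new Buffer(size)` could.

const zeroed = Buffer.alloc(64);                 // zero-filled by default
const scratch = Buffer.allocUnsafe(64).fill(0);  // faster allocation; fill it before use
const greeting = Buffer.from('hello', 'utf8');   // copies data from an existing value

console.log(zeroed.length, scratch.length, greeting.toString());
```

For older modules that still rely on the legacy constructor, the zero-fill-buffers flag mentioned above can be passed on the command line (for example, `node --zero-fill-buffers app.js`) to zero-fill those allocations, trading some performance for safety.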
- -“The Node.js Project has done an incredible job of bringing this version to life in the -timeline that we initially proposed in September 2015. It’s important for us to continue to -deliver new versions of Node.js equipped with all the cutting-edge JavaScript features to -serve the needs of developers and to continue to improve the performance and stability -enterprises rely on,” said Mikeal Rogers, Community Manager of the Node.js Foundation. “This -release is committed to Long Term Support, which allows predictable long-term stability, -reliability, performance and security to the growing number of enterprise users that are -adopting Node.js as a key technology in their infrastructure.” - -To increase reliability of Node.js, there has been increased documentation and testing done -around Node.js v6 for enterprises that are using and looking to implement the platform. - -Node.js release versioning follows the Semantic Versioning Specification, a specification for -version numbers of software libraries similar to dependencies. Under the Node.js’ [Long-Term -Support (LTS)](https://github.com/nodejs/LTS/), version 6 is now the “Current” release line -while version 5 will be maintained for a few more months. In October 2016, Node.js v6 will -become the LTS release and the LTS release line (version 4) will go under maintenance mode in -April 2017, meaning only critical bugs, critical security fixes and documentation updates -will be permitted. Users should begin transitioning from v4 to v6 in October when v6 goes -into LTS. - -Additional Resources -* [Download version 6](https://nodejs.org/download/release/v6.0.0/) -* [Download version 4](https://nodejs.org/en/download/) -* [Technical blog with additional new features and updates](https://nodejs.org/en/blog/) - -About Node.js Foundation -Node.js is used by tens of thousands of organizations in more than 200 countries and amasses -more than 3.5 million active users per month. It is the runtime of choice for -high-performance, low latency applications, powering everything from enterprise applications, -robots, API engines, cloud stacks and mobile websites. - -The Foundation is made up of a diverse group of companies including Platinum members Famous, -IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource -and Modulus/Progress Software, and Silver members include Apigee, AppDynamics, Codefresh, -DigitalOcean, Fidelity, Google, Groupon, nearForm, New Relic, npm, Opbeat, RisingStack, Sauce -Labs, SAP, StrongLoop (an IBM company), Sphinx, YLD!, and Yahoo!. Get involved here: -[https://nodejs.org](https://nodejs.org). diff --git a/locale/fa/blog/announcements/welcome-google.md b/locale/fa/blog/announcements/welcome-google.md deleted file mode 100644 index 944d7b8b67d4..000000000000 --- a/locale/fa/blog/announcements/welcome-google.md +++ /dev/null @@ -1,18 +0,0 @@ ---- -title: Welcome Google Cloud Platform! -date: 2016-03-29T13:00:00.000Z -status: publish -category: Annoucements -slug: welcome-google -layout: blog-post.hbs ---- - -Google Cloud Platform joined the Node.js Foundation today. This news comes on the heels of the Node.js runtime going into beta on [Google App Engine](https://cloudplatform.googleblog.com/2016/03/Node.js-on-Google-App-Engine-goes-beta.html), a platform that makes it easy to build scalable web applications and mobile backends across a variety of programming languages. 
- -In the industry, there’s been a lot of conversations around a third wave of cloud computing that focuses less on infrastructure and more on microservices and container architectures. Node.js, which is a cross-platform runtime environment that consists of open source modules, is a perfect platform for these types of environments. It’s incredibly resource-efficient, high performing and well-suited to scalability. This is one of the main reasons why Node.js is heavily used by IoT developers who are working with microservices environments. - -“Node.js is emerging as the platform in the center of a broad full stack, consisting of front end, back end, devices and the cloud,” said Mikeal Rogers, community manager of the Node.js Foundation. “By joining the Node.js Foundation, Google is increasing its investment in Node.js and deepening its involvement in a vibrant community. Having more companies join the Node.js Foundation helps solidify Node.js as a leading universal development environment.” - -Along with joining the Node.js Foundation, Google develops the V8 JavaScript engine which powers Chrome and Node.js. The V8 team is working on infrastructural changes to improve the Node.js development workflow, including making it easier to build and test Node.js on V8’s continuous integration system. Google V8 contributors are also involved in the Core Technical Committee. - -The Node.js Foundation is very excited to have Google Cloud Platform join our community and look forward to helping developers continue to use Node.js everywhere. diff --git a/locale/fa/blog/announcements/welcome-redhat.md b/locale/fa/blog/announcements/welcome-redhat.md deleted file mode 100644 index ab2e1987d8b8..000000000000 --- a/locale/fa/blog/announcements/welcome-redhat.md +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Node.js Foundation Welcomes Red Hat as Newest Platinum Member -date: 2015-10-06T12:30:00.000Z -status: publish -category: Annoucements -slug: welcome-redhat -layout: blog-post.hbs ---- - -# Node.js Foundation Welcomes Red Hat as Newest Platinum Member - -> Company Looks to Accelerate Node.js Adoption for Enterprise Software Development - -**SAN FRANCISCO, Oct. 6, 2015** – The [Node.js Foundation](https://foundation.nodejs.org/), a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced Red Hat, Inc. has joined the Foundation as a Platinum member. Red Hat joins platinum members, including Famous, IBM, Intel, Joyent, Microsoft and PayPal, to provide support in the adoption, development and long-term success of the Node.js project. - -Node.js is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications to robots. Over the last two years, more large enterprises, including Red Hat, IBM, PayPal, Fidelity, and Microsoft, have adopted Node.js as part of their enterprise fabric. Today there are 2 million unique IP addresses installing Node.js packages and more than 2 billion package downloads in the last month. - -Often used for building fast, scalable network applications, Node.js supports Red Hat technologies such as [Red Hat Mobile Application Platform](https://www.redhat.com/en/technologies/mobile/application-platform), and is available in [OpenShift by Red Hat](https://www.openshift.com/) and [Red Hat Software Collections](http://developerblog.redhat.com/tag/software-collections/). 
As a new member, Red Hat is providing financial support, technical contributions, and high-level policy guidance for the newly formed Foundation that operates as a neutral organization to support the project governed by the Node.js community. - -“Node.js has become an important tool for developers who need to build and deploy a new generation of highly responsive, scalable applications for mobile and Internet of Things (IoT),” said Rich Sharples, senior director, Product Management at Red Hat. “We welcome deeper collaboration with the Node.js Foundation and broader community, and look forward to helping increase the role that the technology plays in enterprise software development.” - -“Node.js is exploding in popularity in almost every aspect of technology from microservices architecture to data-intensive applications that run across distributed devices,” said Danese Cooper, Chairperson of the Node.js Foundation Board. “It is a pivotal moment for the technology, and the support of Foundation members is imperative to ensure that Node.js stays relevant and addresses topical projects and problems that are happening within the wider Node.js community.” - -Additional Resources -* Learn more about the [Node.js Foundation](https://foundation.nodejs.org/) and get involved with [the project](https://nodejs.org/en/get-involved/). - -### About Node.js Foundation - -Node.js Foundation is a collaborative open source project dedicated to building and supporting the Node.js platform and other related modules. Node.js is used by tens of thousands of organizations in more than 200 countries and amasses more than 2 million downloads per month. It is the runtime of choice for high-performance, low latency applications, powering everything from enterprise applications, robots, API engines, cloud stacks and mobile websites. The Foundation is made up of a diverse group of companies including Platinum members Famous, IBM, Intel, Joyent, Microsoft, PayPal and Red Hat. Gold members include GoDaddy, NodeSource and Modulus/Progress Software, and Silver members include Apigee, Codefresh, DigitalOcean, Fidelity, Groupon, nearForm, npm, Sauce Labs, SAP, and YLD!. Get involved here: [https://nodejs.org](https://nodejs.org). - -The Node.js Foundation is a Collaborative Project at The Linux Foundation. Linux Foundation Collaborative Projects are independently funded software projects that harness the power of collaborative development to fuel innovation across industries and ecosystems. [https://foundation.nodejs.org/](https://foundation.nodejs.org/) diff --git a/locale/fa/blog/community/2017-election.md b/locale/fa/blog/community/2017-election.md deleted file mode 100644 index ec2b55dbb99e..000000000000 --- a/locale/fa/blog/community/2017-election.md +++ /dev/null @@ -1,83 +0,0 @@ ---- -title: Node.js Foundation Individual Membership Director election opens Friday, January 20 -date: 2017-01-20T09:00:00.000Z -status: publish -category: Community -slug: 2017-election-nodejs-foundation -layout: blog-post.hbs -author: Tracy Hinds ---- - -The Node.js Foundation is a member-supported organization. The Node.js -Foundation Individual Director is the Node.js project’s community voice on the -board. There are two individual directors that sit on the Node.js Foundation -board and they serve a two-year term. - ->> “Having the community represented is extremely important for open source -projects with a Board of Directors as diverse as ours. 
This gives the community -a voice on the board and helps to guide how we make the investment decisions. -The individual board members are able to provide feedback on how proposed board -items might impact the community and can provide suggestions on how the Node.js -Foundation can better support the community. With this knowledge and feedback, -the board is able to contribute the resources the community needs to stay -healthy and continue to grow.” - ->> *Todd Moore, Node.js Foundation Director* - -The Individual Membership Director is responsible for soliciting feedback and -data that represents the wishes of other individual members and the community at -large. They have been entrusted with the duty to make decisions based on the -information they receive to best represent the community, and can gather input -for proposals when relevant and granted permission to do so. - -This includes participating and voting in Board meetings, introducing and -driving forward initiatives to conclusion that capture the mission of the -Node.js project, and representing the Board at speaking engagements (this is by -no means a comprehensive list). Read more about [“What’s it like being on the -Node.js Foundation Board of -Directors?”](https://medium.com/@nodejs/whats-it-like-being-on-the-node-js-foundation-board-of-directors-f9456b8b7c4d). - -### What does the Board of Directors do? -The Board meets every month to approve resolutions and discuss Node.js -Foundation administrative matters. This includes legal considerations, budgeting -and approving Foundation-led conferences and other initiatives. Technical -governance is overseen by the TSC, not the Board of Directors. - -The current board members are listed -[here](https://foundation.nodejs.org/board). - -### Who is running for the Individual Membership Director seat? -Read more about why our candidates are running below. - -- William Kapke [@williamkapke](https://github.com/williamkapke) - - [I'm running for the Node.js Board of Directors!](https://www.youtube.com/watch?v=zPBOkqclJFc&feature=youtu.be) -- Kat Marchán [@zkat](https://github.com/zkat) - - [Director nomination for Kat Marchán](https://gist.github.com/zkat/345d1485fc4cd1f45155678a3729cd21) -- Charlie Robbins [@indexzero](https://github.com/indexzero) - - [I'm running to increase transparency to the community from the Node.js Board - of Directors](https://medium.com/@indexzero/vote-to-increase-transparency-in-the-node-js-foundation-4a2b22ffaada) -- William P. Riley-Land [@wprl](https://github.com/wprl) - - [I Would Like to Represent Individual Members of the Node.js Foundation](https://medium.com/@wprl/i-would-like-to-represent-individual-members-of-the-node-js-foundation-977157d90aa0#.hq3vo8d8m) - -### When is the election? Nominations were solicited until January 15th. -- Ballot will be distributed on January 20th. -- The election will close **January 30th at 17:00 UTC**. - -### How do I vote? -You must be an Individual Member of the Node.js Foundation to cast a vote. If -you are a member, you can vote [NOW](https://vote.linuxfoundation.org)! - -### How do I become a member? -Individual membership costs [$100 a year, or $25 for students](https://identity.linuxfoundation.org/pid/99). -Contributors to the Node.js project, including all Working Groups and -sub-projects, are eligible for free membership. Please -[contact us](mailto:membership@nodejs.org) for discount codes. You are -required to have a GitHub account to register. - -### What’s the benefit of being an individual member? 
- - You have a vote and voice on the Node.js Foundation Board of Directors - through the two above-mentioned elected Individual Membership Directors. - - 20% off regular price registration to Node Interactive 2017 - - -See you at the (digital)poll? diff --git a/locale/fa/blog/community/building-nodejs-together.md b/locale/fa/blog/community/building-nodejs-together.md deleted file mode 100644 index b529069745f8..000000000000 --- a/locale/fa/blog/community/building-nodejs-together.md +++ /dev/null @@ -1,188 +0,0 @@ ---- -title: Building Node.js Together -author: tjfontaine -date: 2014-07-29T21:00:00.000Z -status: publish -category: Community -slug: building-nodejs-together -layout: blog-post.hbs ---- - -Node.js is reaching more people than ever, it's attracting new and interesting -use cases, at the same time as seeing heavy adoption from traditional -engineering departments. Managing the project to make sure it continues to -satisfy the needs of its end users requires a higher level of precision and -diligence. It requires taking the time to communicate and reach out to new and -old parties alike. It means seeking out new and dedicated resources. It means -properly scoping a change in concert with end users, and documenting and -regularly check pointing your progress. These are just some of the ways we're -working to improve our process, and work to deliver higher quality software -that meets our goals. - -## Documentation - -One of the big things we've wanted to do is to change the way the website -works, which is something I've [mentioned -before](http://blog.nodejs.org/2014/01/16/nodejs-road-ahead/). It should be a -living breathing website whose content is created by our end users and team. -The website should be the canonical location for documentation on how to use -Node.js, how Node.js works, and how to find out what's going on in the Node -community. We have seeded the initial documentation with [how to -contribute](https://nodejs.org/en/get-involved/contribute/), [who the core team -is](https://nodejs.org/en/about/organization/#index_md_technical_steering_committee), -and some basic documentation of the [project -itself](https://nodejs.org/en/about/organization). From there we're looking to -enable the community to come in and build out the rest of the framework for -documentation. - -One of the key changes here is that we're extending the tools that generate API -documentation to work for the website in general. That means the website is now -written in markdown. Contributions work with the same -[pull-request](https://nodejs.org/en/get-involved/contribute/#code-contributions) -way as contributions to Node itself. The intent here is to be able to quickly -generate new documentation and improve it with feedback from the community. - -The website should also be where we host information about where the project is -going and the features we're currently working on (more about that later). But -it's crucial we communicate to our end users what improvements will be coming, -and the reasons we've made those decisions. That way it's clear what is coming -in what release, and also can inspire you to collaborate on the design of that -API. This is not a replacement for our issue tracking, but an enhancement that -can allow us to reach more people. - -## Features - -Which brings us to the conversation about features. 
During the Q & A portions -of the [Node.js on the -Road](http://blog.nodejs.org/2014/06/11/notes-from-the-road/) events there are -often questions about what does and doesn't go into core. How the team -identifies what those features are and when you decide to integrate them. I've -spent a lot of time talking about that but I've also -[added](https://nodejs.org/en/about/organization) it to the new documentation on -the site. - -It's pretty straight forward, but in short if Node.js needs an interface to -provide an abstraction, or if everyone in the community is using the same -interface, then those interfaces are candidates for being exposed as public -interfaces for Node. But what's important is that the addition of an API should -not be taken lightly. It is important for us to consider just how much of an -interface we can commit to, because once we add the API it's incredibly hard -for us to change or remove it. At least in a way that allows people to write -software that will continue to work. - -So new features and APIs need to come with known use cases and consumers, and -with working test suites. That information is clearly and concisely present on -the website to reach as wide of an audience as possible. Once an implementation -meets those requirements it can be integrated into the project. Then and only -then, when we have an implementation that meets the design specification and -satisfies the test suite, will we be able to integrate it. That's how we'll -scope our releases going forward, that's how we'll know when we're ready to -release a new version of Node. This will be a great change for Node, as it's a -step forward on moving to an always production ready master branch. - -## Quality Software - -And it's because Node.js is focused on quality software and a commitment to -backwards compatibility that it's important for us to seek ways to get more -information from the community about when and where we might be breaking them. -Having downstream users test their code bases with recent versions of Node.js -(even from our master branch) is an important way we derive feedback for our -changes. The sooner we can get that information, the more test coverage we can -add, the better the software we deliver is. - -Recently I had the opportunity to speak with [Dav -Glass](http://twitter.com/davglass) from [Yahoo!](http://yahoo.com), and we're -going to be finding ways to get automated test results back from some larger -test suites. The more automation we can get for downstream integration testing -the better the project can be at delivering quality software. - -If you're interested in participating in the conversation about how Node.js can -be proactively testing your software/modules when we've changed things, please -[join the conversation](http://github.com/joyent/node/issues). - -## Current release - -Before we can release v0.12, we need to ensure we're providing a high quality -release that addresses the needs of the users as well as what we've previously -committed to as going into this release. Sometimes what can seem like an -innocuous change that solves an immediate symptom, doesn't actually treat the -disease, but instead results in other symptoms that need to be treated. -Specifically in our streams API, it can be easy to subtly break people while -trying to fix another bug with good intent. - -This serves as a reminder that we need to properly scope our releases. We need -to know who the consumers are for new APIs and features. 
We need to make sure -those features' test cases are met. We need to make sure we're adopting APIs -that have broad appeal. And while we're able to work around some of these -things through external modules and experimenting with JavaScript APIs, that's -not a replacement for quality engineering. - -Those are the things that we could have done better before embarking on 0.12, -and now to release it we need to fix some of the underlying issues. Moving -forward I'm working with consumers of the tracing APIs to work on getting a -maintainable interface for Node that will satisfy their needs. We'll publicly -document those things, we'll reach out to other stakeholders, and we'll make -sure that as we implement that we can deliver discretely on what they need. - -That's why it's important for us to get our releases right, and diagnose and -fix root causes. We want to make sure that your first experience with 0.12 -results in your software still working. This is why we're working with large -production environments to get their feedback, and we're looking for those -environments and you to [file bugs](https://github.com/joyent/node/issues) that -you find. - -## The Team - -The great part about Node's contribution process and our fantastic community is -that we have a lot of very enthusiastic members who want to work as much as -possible on Node. Maybe they want to contribute because they have free time, -maybe they want to contribute to make their job easier, or perhaps they want to -contribute because their company wants them to spend their time on open source. -Whatever the reason, we welcome contributions of every stripe! - -We have our core team that manages the day to day of Node, and that works -mostly by people wanting to maintain subsystems. They alone are not solely -responsible for the entirety of that subsystem, but they guide its progress by -communicating with end users, reviewing bugs and pull requests, and identifying -test cases and consumers of new features. People come and go from the core -team, and recently we've added [some -documentation](https://nodejs.org/en/about/organization) that describes how you -find your way onto that team. It's based largely around our contribution -process. It's not about who you work for, or about who you know, it's about -your ability to provide technical improvement to the project itself. - -For instance, Chris Dickinson was recently hired to work full time on Node.js, -and has expressed interest in working on the current and future state of -streams. But it's not who employs Chris that makes him an ideal candidate, but -it will be the quality of his contributions, and his understanding of the ethos -of Node.js. That's how we find members of the team. And Chris gets that, in -[his blog](http://neversaw.us/2014/05/08/on-joining-walmart-labs/) about -working full time on Node.js he says (and I couldn't have said it better -myself): - -> I will not automatically get commit access to these repositories — like any -other community member, I will have to continually submit work of consistent -quality and put in the time to earn the commit bit. The existing core team will -have final say on whether or not I get the commit bit — which is as it should -be! - -Exactly. And not only does he understand how mechanism works, but he's [already -started](http://neversaw.us/2014/07/13/june-recap/) getting feedback from -consumers of streams and documenting some of his plans. 
- -In addition to Chris being hired to work full time on Node.js, Joyent has -recently hired [Julien Gilli](https://github.com/misterdjules) to work full -time with me on Node. I'm really excited for all of the team to be seeking out -new contributors, and getting to know Chris and Julien. They're both fantastic -and highly motivated, and I want to do my best to enable them to be successful -and join the team. But that's not all, I've been talking to other companies who -are excited to participate in this model, and in fact -[Modulus.io](http://modulus.io) themselves are looking to find someone this -year to work full time on Node.js. - -Node.js is bigger than the core team, it's bigger than our community, and we -are excited to continue to get new contributors, and to enable everyone. So -while we're working on the project we can't just focus on one area, but instead -consider the connected system as a whole. How we scale Node, how we scale the -team, how we scale your contributions, and how we integrate your feedback -- -this is what we have to consider while taking this project forward, together. diff --git a/locale/fa/blog/community/foundation-benefits-all.md b/locale/fa/blog/community/foundation-benefits-all.md deleted file mode 100644 index a9d20b7da975..000000000000 --- a/locale/fa/blog/community/foundation-benefits-all.md +++ /dev/null @@ -1,90 +0,0 @@ ---- -title: The Node.js Foundation benefits all -author: Scott Hammond -date: 2015-05-15T22:50:46.000Z -status: publish -category: Community -slug: the-nodejs-foundation-benefits-all -layout: blog-post.hbs ---- - -When I joined Joyent last summer I quickly realized that, despite the huge -success of Node.js in the market and the tireless work of many here at Joyent, -there were challenges in the project that we needed to address. Through -discussions with various project contributors, Node.js users, ecosystem -vendors and the [Node.js Advisory Board](http://nodeadvisoryboard.com), it -became clear that the best way to address the concerns of all key stakeholders -(and the best thing for Node.js as a whole) was to establish the Foundation as -a path for the future. - -The biggest and most obvious challenge we sought to address with the -Foundation was the friction that existed amongst some developers in the -Node.js community. Historically, leadership ran the project fairly tightly, -with a small core of developers working in a BDFL model. It was difficult for -new people to join the project, and there wasn’t enough transparency for such -a diverse, passionate community to have a sense of ownership. Consequently, a -group of developers who wanted to operate under a more open governance model -created the io.js fork. That team has done a great job innovating on -governance and engagement models, and the Node.js Foundation’s models will be -based on those policies to ensure broader community engagement in the future -of Node.js. We welcome community review and feedback on [the draft governance -documents](https://github.com/joyent/nodejs-advisory-board/tree/master/governance-proposal). - -With the recent vote by the io.js TC to join the Node.js Foundation, we took a -giant leap toward rebuilding a unified community. @mikeal, @piscisaureus and -others have done an excellent job evangelizing the value of the Foundation, -and it’s great to see it have such positive impact this early in its -formation. - -Reunification of the Node.js developer community remains an important goal of -the Foundation. 
But to have a successful project, we must also maintain focus -on addressing the concerns of Node.js users and the ecosystem of vendors. If -we succeed, Node.js will continue its meteoric rise as the de facto server-side -JavaScript platform, and everyone wins. If we get it wrong, we jeopardize the -momentum and critical mass that's driven that growth, and everyone loses. - -In the user community, enterprise adoption of Node.js has skyrocketed with an -abundance of success stories. But behind every successful project is someone -who is betting their career on the choice to build with Node.js. Their primary -“ask” is to de-risk the project. They want stable, production-grade code that -will handle their technical requirements and an LTS that matches what they get -from other software. The Foundation will get that right. Donations to the -Foundation will provide the resources we need to broaden and automate the -necessary test suites and expand coverage across a large set of platforms. We -are working now on codifying the LTS policy (comments welcome -[here](https://github.com/nodejs/dev-policy/issues/67)) and will establish the -right 6-9 month release cadence with rigor on backward compatibility and EOL -horizon. - -Users also want the project to be insulated from the direction of any single -company or individual. Putting the project into a foundation insulates it from -the commercial aspirations of Joyent or any other single company. It also -facilitates the creation of the vibrant vendor ecosystem around Node.js that -users want. Users want to see relevant innovation from a strong group of -contributors and vendors. - -The vendors themselves have a clear set of requirements that can best be -addressed by the Foundation. They want a level playing field and they want to -know they can monetize the contributions they make to the project. We need a -vibrant ecosystem to complete the solution for the users of Node.js and drive -additional value and innovation around the core project. The ecosystem is the -force multiplier of value for every piece of technology, and Node.js is no -exception. - -Finally, in addition to risk mitigation, transparency, neutrality and an open -governance model, the Foundation will provide needed resources. Over the past -few years Joyent and other members of the community have invested thousands of -hours and millions of dollars into the project, and much has been -accomplished. Going forward, Joyent will continue to invest aggressively in -the success and growth of Node.js. But now, with the support of new Foundation -members, we will be able to do even more. Investments from new members can be -used to expand coverage of testing harnesses, establish API compatibility -tests and certifications, extend coverage for additional platforms, underwrite -travel expenses for technical meetups for core contributors, build training -programs for users and developers, expand community development efforts, fund -full-time developers and more. - -I'm convinced the Foundation is the best vehicle for balancing the needs of -Node.js users, vendors and contributors. The project has a brilliant future -ahead of it and I am more optimistic than ever that we can work together as -one strong community to secure that future.
diff --git a/locale/fa/blog/community/index.md b/locale/fa/blog/community/index.md deleted file mode 100644 index 29d5ac3dfd5b..000000000000 --- a/locale/fa/blog/community/index.md +++ /dev/null @@ -1,6 +0,0 @@ ---- -title: Community -layout: category-index.hbs -listing: true -robots: noindex, follow ---- diff --git a/locale/fa/blog/community/individual-membership.md b/locale/fa/blog/community/individual-membership.md deleted file mode 100644 index a44dad86fdb4..000000000000 --- a/locale/fa/blog/community/individual-membership.md +++ /dev/null @@ -1,49 +0,0 @@ ---- -title: Node.js Foundation Individual Membership Now Open -date: 2015-11-04T12:00:00.000Z -status: publish -category: Community -slug: individual-membership-nodejs-foundation -layout: blog-post.hbs -author: mikeal ---- - -The Node.js Foundation is a member-supported organization. To date we've added over 20 corporate members who provide the financial support necessary for the Foundation to thrive. - -With the support of the Linux Foundation we are now able to launch an Individual Membership program. These members will be electing two representatives to the Board of Directors this January who will be -responsible for representing the diverse needs of the Node.js community in the administration of the Node.js Foundation. - -## How do I become a member? - -Membership costs [$100 a year, or $25 for students](https://identity.linuxfoundation.org/pid/99). -Contributors to the Node.js project, including all Working Groups and sub-projects, are eligible for free membership. - -You are required to have a GitHub account to register. - -## Who can run for the board of directors? - -Any registered member. - -Keep in mind that every meeting of the Board must reach quorum in order to pass resolutions, so only people who can make themselves available on a recurring and consistent basis should consider running. - -## What does the Board of Directors do? - -The Board meets every month to approve resolutions and discuss Node.js Foundation administrative matters. This includes legal considerations, budgeting and approving Foundation-led conferences and other initiatives. Technical governance is overseen by the TSC, not the Board of Directors. - -The current board members are listed [here](https://foundation.nodejs.org/about/leadership). - -## What are the term lengths? - -The standard term length for those elected by the individual membership is 2 years, with an election each year to select a new representative for a new term. - -However, in the first election two representatives will be elected; the representative with the most votes will be elected for the standard 2 year term and the runner-up will serve a special 1-year term so that in 2017 we can elect a single new director for a 2 year staggered term. - -## When is the election? - -* Nominations are being solicited until January 15th. -* A ballot will be distributed on January 20th. -* The election will be completed by January 30th. - -## How do I run in the 2016 election? - -After you've registered as a member follow the instructions [here](https://github.com/nodejs/membership/issues/12). 
diff --git a/locale/fa/blog/community/next-chapter.md b/locale/fa/blog/community/next-chapter.md deleted file mode 100644 index d69957597278..000000000000 --- a/locale/fa/blog/community/next-chapter.md +++ /dev/null @@ -1,103 +0,0 @@ ---- -title: Next Chapter -author: tjfontaine -date: 2015-05-08T19:00:00.000Z -status: publish -category: Community -slug: next-chapter -layout: blog-post.hbs ---- - -Open source projects are about the software, the users, and the community. Since -becoming project lead in 2014, I've been privileged to be a part of the most -passionate, diverse, and vibrant community in the ecosystem. The community is -responsible for Node.js' meteoric rise and continued adoption by users and -companies all over the world. Given the strength of its community, I'm confident -that Node.js is heading in the right direction. With that said, it's time for me -to step back. - -For the past year, I've worked directly with community members to improve -Node.js, focusing on improving the parts of the project that benefit everyone. -We wanted to know what in Node.js was working for them and what wasn't. During -the life of a project, it's crucial to constantly reset yourself and not lose -sight of your identity. Node.js is a small set of stable core modules, doing one -thing, and one thing well. Every change we make, we tried to make sure we were -being true to ourselves and not violating our ethos. We've focused on -eliminating bugs and critical performance issues, as well as improving our -workflows. Ultimately, our goal was to ensure Node.js was on the right path. - -The formation of the Node.js Foundation couldn't have happened at a better time -in the life of Node.js. I believe this will be the tipping point that cements -Node's place in technology. Soon, the foundation will be announcing its first -meeting, initial membership, and future plans for Node.js. The project is on the -right path, has the right contributors and is not tied to one person. It has a -vibrant and loyal community supporting it. - -I want to take some time to highlight a few of those who have made an impact on -Node.js. This list only scratches the surface, but these are a few of the unsung -contributors that deserve some attention: - -Node.js wanted to have a [living breathing -site](https://github.com/joyent/node-website), one that could attract our -community and be the canonical source of documentation and tutorials for -Node.js. Leading the charge has been [Robert -Kowalski](https://github.com/robertkowalski) and [Wyatt -Preul](https://github.com/geek), who have been incredibly helpful to the Node.js -ecosystem in many ways, but most notably by helping breathe life in the website. - -One key point of the maturity for Node.js has been its growing predominance -worldwide. Therefore, we've been working to improve our support for -internationalization and localization. Node.js is so widely accepted that our -users need Node.js to support internationalization so they can better support -their own customers. Luckily, we have [Steven Loomis](https://github.com/srl295) -leading the charge on this — he has the unique privilege of being a member of -both ICU and Node.js. - -Node.js is seeing adoption across many new platforms, which means we need to -collaborate with the community to support those platforms. 
Much like we have -[Alexis Campilla](https://github.com/orangemocha) working to support the Windows -platform, we have people like [Michael Dawson](https://github.com/mhdawson) -working on adding support for PowerPC and zSeries. Additionally, he's been able -to leverage the technical depth of IBM to help squash bugs and do work on our VM -backend of V8. - -OpenSSL has had its share of issues recently, but it's not the only dependency -that can be sensitive to upgrade -- so many thanks go to [James -Snell](https://github.com/jasnell) for working to help simplify and manage those -upgrades. James has also been working together with our large, diverse, and -complex community to make sure our development policies are easy to understand -and approachable for other new contributors. - -Finally, I want to make a very special mention of [Julien -Gilli](https://github.com/misterdjules), who has been an incredible addition to -the team. Julien has been responsible for the last few releases of Node.js — -both the v0.10 and v0.12 branches. He's done wonders for the project, mostly -behind the scenes, as he has spent tons of time working on shoring up our CI -environment and the tests we run. Thanks to him, we were able to ship v0.12.0 -with all our tests passing and on all of our supported platforms. This was the -first Node.js release ever to have that feature. He has also been working -tirelessly to iterate on the process by which the team manages Node.js. Case in -point is the excellent -[documentation](https://nodejs.org/documentation/workflow/) he's put together -describing how to manage the workflow of developing and contributing to the -project. - -In short, hiring Julien to work full time on Node.js has been one of the best -things for the project. His care and concern for Node.js, its users, and their -combined future is evident in all of his actions. Node.js is incredibly lucky to -have him at its core and I am truly indebted to him. - -It's because of this strong team, community, and the formation of the Foundation -that it makes it the right time for me to step back. The foundation is here, the -software is stable, and the contributors pushing it forward are people I have a -lot of faith in. I can't wait to see just how far Node.js' star will rise. I am -excited to see how the contributors grow, shape and deliver on the promise of -Node.js, for themselves and for our users. - -Moving forward, I will still remain involved with Node.js and will provide as -much help and support to the rest of the core team as they need. However, I -won't have the time to participate at the level needed to remain a core -contributor. With the core team and the community working together, I know they -won't miss a step. - - diff --git a/locale/fa/blog/community/node-leaders-building-open-neutral-foundation.md b/locale/fa/blog/community/node-leaders-building-open-neutral-foundation.md deleted file mode 100644 index fb27e9f3dcc5..000000000000 --- a/locale/fa/blog/community/node-leaders-building-open-neutral-foundation.md +++ /dev/null @@ -1,126 +0,0 @@ ---- -title: Node.js and io.js leaders are building an open, neutral Node.js Foundation to support the future of the platform -author: Mike Dolan -date: 2015-05-15T23:50:46.000Z -status: publish -category: Community -slug: node-leaders-are-building-an-open-foundation -layout: blog-post.hbs ---- - -Just a couple months ago a variety of members of the Node.js and io.js -community announced they would discuss establishing a neutral foundation for -the community. 
The Linux Foundation has since been helping guide discussions -with contributors, developers, users and leaders in these communities, -increasingly expanding the scope of discussion to more stakeholders. Node.js -and io.js have a long, complex history, and the facilitated discussions have -brought together key leaders to focus on what the future might mean for these -technologies. - -A lot of progress has been made in just a few short months, and we're -entering the final stages of discussions and decisions that will guide the -projects forward. Most recently [the io.js TC voted to join in the -Foundation](https://github.com/nodejs/node/issues/1705) effort and planning is -already underway to begin the process of converging the codebases. The neutral -organization, or foundation, will be a key element of that work and has been -discussed at length by those involved. When a technology and community reach a -level of maturity and adoption that outgrows one company or project, a -foundation becomes a critical enabler for ongoing growth. - -Foundations can be used to support industrial-scale open source projects that -require a legal entity to hold assets or conduct business (hiring, internship -programs, compliance, licensing trademarks, marketing and event services, -fundraising, etc.). Ultimately, foundations enable communities to participate in -large-scale collaboration under agreed-upon terms that no one company, person -or entity can change or dictate. - -It's important to note that while critical, an open governance model does not -guarantee success or growth. The io.js project has a strong developer -community, for example, but to grow further it needs a model that enables funding -and investment in the project. If you haven't already, please [take a look -at Mikeal Rogers' blog post](https://medium.com/node-js-javascript/growing-up-27d6cc8b7c53). -The Node.js community has needed an avenue for other companies -to participate as equals in a neutral field. Growing a community and widening -the adoption of a technology both take resources and a governance model that -serves everyone involved. A foundation becomes the place where participants -can meet, agree on paths forward, ensure a neutral playing field in the -community and invest resources to grow the community even more. It can also -allow for broad community engagement through liberal contribution policies, -community self-organization and working groups. - -At The Linux Foundation, we've helped set up neutral organizations that -support a variety of open source projects and communities through open and -neutral governance, and we believe the future is bright for the Node.js and io.js -communities. The technology being created has incredible value and expanding -use cases, which is why getting the governance model right and defining the role of -the Foundation to support the developer community is the number one priority. - -While I'm a relative "newbie" to both the Node.js and io.js communities, I've -been able to identify, with our team at The Linux Foundation, a number of -opportunities, as well as very common challenges in both communities that -relate to other projects we've helped before. What we've found is that the -challenges the Node.js and io.js communities have are not unique; many open -source projects struggle with the same challenges, and many have been -successful.
As I've [previously written on -Linux.com](https://www.linux.com/news/featured-blogs/205-mike-dolan/763051-five-key-features-of-a-project-designed-for-open-collaboration), -there are five key features that we see in successful open governance: - -1. open participation -2. open, transparent technical decision making -3. open design and architecture -4. an open source license -5. an open, level playing field for intellectual property. - -I think these same features apply to the case for a foundation in the Node.js -and io.js communities. The io.js project has certainly been founded on many of -these principles and has taken off in terms of growing its developer -community. Many in the io.js community joined because they felt these -principles were not present elsewhere. For all of these reasons, we leveraged -the governance provisions from io.js to [draft proposals for the technical -community governance](https://github.com/joyent/nodejs-advisory-board/tree/master/governance-proposal). - -Now I'd like to share specific next steps for establishing the Node.js -Foundation (all of this is of course subject to change based on input from the -communities). We've started with a core group that offered advice on how to -address key governance issues. We've expanded the circle to the technical -committees of both communities and are now taking the discussion to the -entirety of both communities. - -1. Draft technical governance documents are [up for review and -comment](https://github.com/joyent/nodejs-advisory-board/tree/master/governance-proposal). - -2. The Foundation Bylaws and Membership Agreements based on our LF templates are -available for companies to sign up as members. There is no need to sign any -agreements as a community developer. If your company is interested in -participating, [now is the time to sign -up](http://f.cl.ly/items/0N1m3x0I3S2L203M1h1r/nodejs-foundation-membership-agreement-2015-march-04.pdf). - -3. Hold elections for the foundation's Gold and Silver member Board Directors, and -have the Technical Steering Committee elect a TSC Chair. The process typically -entails 1 week of nominations, 3-5 days of voting and then announcing the -election winners. - -4. Set up an initial Board meeting, likely mid-June. The first Board meeting will -put in place all of the key legal documents, policies, operations, etc. that -are being discussed (the reason for wrapping up edits on May 8). - -5. Initiate TSC meetings under the new foundation upon resolution of both -technical committees. The TSC will meet regularly on open, recorded calls. -Details will be posted on a foundation wiki or page. The combined io.js and -Node.js TCs have been meeting roughly every other week to work through the -[Convergence planning](https://github.com/jasnell/dev-policy/blob/6601ca1cd2886f336ac65ddb3f67d3e741a021c9/convergence.md). - -6. May 25 - June 5: Announce the new foundation, members, initial Board Directors -(elections may be pending), TSC members and any reconciliation plans agreed to -by the TSC (if ready). - -And so I ask both communities to review the ideas being proposed, including -how best to align goals, align resources and establish a platform for growing -adoption of an amazing technology that the development community is working to build. -I would like to thank the people building this future. Some you know; others -you do not. It takes a lot of personal strength to voice opinions and stand up -for new ideas in large communities.
I appreciate the candor of the discussions -but also ask you to seek out those putting forth ideas to understand them and -to question them in a constructive dialogue. This community has another decade -or more ahead of it; now is the time to set the right foundational elements to -move forward. diff --git a/locale/fa/blog/community/node-v5.md b/locale/fa/blog/community/node-v5.md deleted file mode 100644 index 25df2f57ed11..000000000000 --- a/locale/fa/blog/community/node-v5.md +++ /dev/null @@ -1,54 +0,0 @@ ---- -title: What You Should Know about Node.js v5 and More -date: 2015-10-30T12:00:00.000Z -status: publish -category: Community -slug: node-v5 -layout: blog-post.hbs ---- - -## There’s Something New with Node.js Releases - -We just released [Node.js v5.0.0](https://nodejs.org/en/blog/release/v5.0.0/). You might be thinking to yourself: These folks just released [Node.js v4.2.1](https://nodejs.org/en/blog/release/v4.2.1/) “Argon,” under the new Long Term Support (LTS) plan, now I need to download this? The answer is yes and no. - -Node.js is growing, and growing fast. As we continue to innovate quickly, we will focus on two different release lines. One release line will fall under our **LTS** plan. All release lines that have LTS support will be even numbers, and (most importantly) focus on stability and security. These release lines are for organizations with complex environments that find it cumbersome to continually upgrade. We recently released the first in this line: [Node.js v4.2.1](https://nodejs.org/en/blog/release/v4.2.1/) “Argon.” - -The other release line is called **Current**. All release lines will be odd numbers, and have a shorter lifespan and more frequent updates to the code. The Current release line will focus on active development of necessary features and refinement of existing APIs. Node.js version 5 is this type of release. - -We want to make sure that you are adopting the release that best meets your Node.js needs, so to break it down: - -Stay on or upgrade to Node.js v4.2.x if you need stability and have a complex production environment, e.g. you are a medium or large enterprise. - -Upgrade to Node.js v5.x if you have the ability to upgrade versions quickly and easily without disturbing your environment. - -Now that you have the very basics, let’s take a deeper look at the new features and characteristics of v5, and the benefits and details of our LTS plan. - -## Introduction to Node.js v5 - -[Node.js v5](https://nodejs.org/en/blog/release/v5.0.0/) is an intermediate feature release line that is best suited for users who have an easier time upgrading their Node.js installations, such as developers using the technology for front-end toolchains. This version will be supported for a maximum of only eight months and will be continually updated with new features and better performance; it is not supported under our LTS plan. - -The release cadence for v5.x will be more rapid than in the past. Expect a new release once every one to two weeks for v5.x. If upgrading is a challenge for you, we suggest you do not use this release. There will be significant ongoing development. The focus is on getting the releases to users as soon as possible. - -npm has been upgraded to v3 in Node.js v5.0.0, which (amongst other changes) will install dependencies as flat as possible in node_modules. 
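If you are unsure which release line, bundled npm version, or LTS status a given installation has, Node.js exposes this information at runtime. A minimal sketch follows; the values shown in the comments are only illustrative:

```javascript
// Quick check of the running release line, bundled npm version, and LTS status.
console.log(process.version);       // e.g. 'v5.0.0'
console.log(process.versions.npm);  // e.g. '3.3.6' (npm 3 ships with Node.js v5)
console.log(process.release.lts);   // a codename such as 'Argon' on LTS lines, undefined otherwise
```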
v5.0.0 also comes with V8 4.6, which ships the [new.target](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/new.target) and [spread operator](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Spread_operator) JavaScript language features. If you want to learn more about other technical details around this, please check out our [release post](https://nodejs.org/en/blog/release/v5.0.0/). - -It’s another top-quality release from us, and we are averaging roughly 50 unique contributors per month to the codebase. We are extremely excited with all the enthusiasm and amazing work that is going into this Node.js v5 and future releases. - -## What Is Long Term Support and Why Does It Matter to Me? - -First and foremost, if you haven’t read the [Essential Steps: Long Term Support (LTS) for Node.js by Rod Vagg](https://medium.com/@nodesource/essential-steps-long-term-support-for-node-js-8ecf7514dbd#.hi7hosy92), Technical Steering Committee Chairperson at the Node.js Foundation and the Chief Node Officer at NodeSource, do so. It’s a very helpful source for understanding our release cycle process. If you only have two minutes now, here is a quick summary: - -* The point of establishing an LTS plan for Node.js is to build on top of an existing stable release cycle by delivering new versions on a predictable schedule that have a clearly defined extended support lifecycle. It is an essential requirement for enterprise application development and operations teams. It also affects companies that provide professional support for Node.js. - -* As stated above, the first LTS release line is v4 “Argon," beginning at v4.2.0 and currently standing at v4.2.1. The next LTS release line will begin in 12 months around the first week of October 2016. All LTS release lines will begin at the same time each year. - -* All LTS release lines are assigned a “codename” drawn from the names of the elements on the Periodic Table. - -* The LTS release line will be actively maintained for a period of 18 months from the date the LTS release line begins. After 18 months have passed, it will transition into Maintenance mode. - -* There will be no more than two active LTS release lines at any given time. Overlap is intended to help ease migration planning. - -* Once a Current release line becomes LTS, no new features or breaking changes will be added to that release. Changes are limited to bug fixes for stability, security updates, possible npm updates, documentation updates and certain performance improvements that can be demonstrated to not break existing applications. - -## Questions? - -If you have any questions you can always connect with us on our [help](https://github.com/nodejs/help) repository. If you encounter an issue log or bug with Node.js v5, please report to our main code repository [here](https://github.com/nodejs/node/issues). 
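As a quick footnote to the V8 4.6 features mentioned above, here is a minimal sketch of `new.target` and the spread operator. The `Point` function and the sample values are purely illustrative:

```javascript
// new.target reveals whether a function was invoked with `new`.
function Point(x, y) {
  if (new.target === undefined) {
    return new Point(x, y); // called without `new`; re-invoke as a constructor
  }
  this.x = x;
  this.y = y;
}

// The spread operator expands an iterable into individual arguments or elements.
const coords = [3, 4];
const p = Point(...coords);     // equivalent to new Point(3, 4)
console.log(p.x, p.y);          // 3 4
console.log([0, ...coords, 5]); // [0, 3, 4, 5]
```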
diff --git a/locale/fa/blog/community/quality-with-speed.md b/locale/fa/blog/community/quality-with-speed.md deleted file mode 100644 index f4e8710a2631..000000000000 --- a/locale/fa/blog/community/quality-with-speed.md +++ /dev/null @@ -1,330 +0,0 @@ ---- -title: Node.js - Quality with Speed -date: 2017-02-22T14:41:04.442Z -status: publish -category: Community -slug: quality-with-speed -layout: blog-post.hbs -author: Michael Dawson and Myles Borins ---- - -# Node.js - Quality with Speed - -One of the key tenets of the Node.js community is to allow change -at a rapid pace in order to foster innovation and to allow Node.js -to be used in a growing number of use cases. - -At the same time the community values quality. Newer versions of -the runtime must be as good or better than earlier versions and must -not un-intentionally break existing applications. - -Instead of trading off one for the other, the community looks for the path -that allows us to maintain our rate of change while ensuring the -required level of quality. - -Many of the activities undertaken by the community over the last year -are in support of this goal. - -This is our take on how these activities fit together. - -# Key strategies - -Several key strategies are in place to build the safety -net in order to enable change/innovation while maintaining -quality. These include: - -* Different release types -* Change flow processes -* Enhancement Proposal process -* Automation and Testing - * Functional Tests - * Module Testing - * Stress Testing - * Platform/OS coverage - * Development Workflows -* Performance Benchmarks -* Tools - -# Release Types - -The Node.js project maintains 3 key types of releases -* Nightlies -* Current -* LTS - -Having different release types allows innovation/change -to flow rapidly into Nightly builds where we can get -early feedback on upcoming changes. -When ready, these changes then transition -to Current and LTS releases in a more controlled manner, -such that the level of quality/stability increases at -each level. - -## Nightlies - -These are built from master and contain the very latest changes -that have been landed. If you want to try out the bleeding edge -these are the binaries to use. There is no additional testing -on these releases, however, the standard Node.js unit tests are -run for each change landed so these will most often be usable. - -## Current - -Changes which have landed in master are backported to Current -on a regular basis. In general all changes that land in master -will be backported to Current, however there may be a lag if -there are specific concerns or for larger changes where the community -believes more soak time is required. One key exception is -that semver breaking changes will not be backported until the -next major version (ex 5 -> 6). This includes v8 and other -components such that the expectation is that an application/module -written to run on a major level will continue to do so. - -These releases are documented in the changelog so -there is more visibility with respect to the changes in each release. -Current releases are created roughly every 1-2 weeks. - -In addition to the the regular Node.js unit tests, CITGM (see -later sections) is run on Current releases. - -If you want to try out the latest with a reasonable expectation -that your application will continue to run, these are the releases -to use. - -## LTS - -Once changes have been proven in the Current stream, they are candidates -for the LTS streams. 
In the first stage of LTS (Active) -changes are limited to: - -* Bug fixes -* Security updates -* Non-semver-major npm updates -* Relevant documentation updates -* Certain performance improvements where the risk of - breaking existing applications is minimal -* Changes that introduce large amount of code churn where - the risk of breaking existing applications is low and - where the change in question may significantly ease the - ability to backport future changes due to the reduction in diff noise. - -Further, in the second stage of an LTS release (Maintenance), only -**critical** bugs and **critical** security fixes will be included. - -Like Current releases, CITGM (see -later sections) is run on LTS releases. In addition we also -track performance through nightly benchmarks reported on -[benchmarking.nodejs.org](https://benchmarking.nodejs.org) (See later sections). - -You can read more about the LTS releases [here](https://github.com/nodejs/lts). - -If you want the best level of stability/quality for your production -applications these are the releases to use. - -# Change flow processes - -We've already touched on this in the discussion on the different release -types but we'll expand on this strategy here. - -The main idea is that as changes flow from Nightlies, to Stable, to LTS -Active, to LTS Maintenance we increase the following: - -* scrutiny -* time - -Changes going into master are well reviewed and time is allowed -(minimum 48 to 72 hours) for as many community members as possible -to comment and review. However, as we all know, some problems -will still get through. - -Before changes are pulled into Current from the Nightly builds, they will have -spent more time in master where intermittent issues may surface in the -ongoing regressions runs and this provides time where users may more fully -exercise the release and report issues. Further, there is an additional -review/sanity check that they are not breaking as they are pulled over to -Current. - -Similarly, before changes are pulled into an LTS update release, -they must have been in a -Current release version for at least a week, and are often left longer. -This provides additional time where users may more fully -exercise the changes and report issues. In addition, changes are more -carefully reviewed as they are pulled into LTS, again reducing the -chance that unintentional breaking changes make it through. As an LTS -release ages, particularly once it reaches maintenance, the scope of -changes that will be pulled in narrows, further reducing the risk. - -When it comes to new LTS versions, changes will have soaked in the latest -release for up to 6 months. In particular, larger changes like an upgrade -to v8 are done early in the lifespan of the stream such that they will have -significant soaking and usage in the Current stream before they make it -into an LTS release. - -This strategy allows for rapid innovation/change, with releases being available -where those changes can be used/validated and a funnel through which -these can flow in an appropriate manner into releases used by more -risk-averse community members. - -# Enhancement Proposal Process - -Some changes are of such scope that they cannot simply be reviewed in a -pull request. There are often larger questions that will factor into the -decision as to whether the change proposed is desirable or appropriate -for the Node.js runtime. - -The strategy for these changes is the "enhancement proposal" process. 
The -proposed change is documented, discussed and moves through a number of -stages including DRAFT and ACCEPTED or REJECTED. You can read more on -the process [here](https://github.com/nodejs/node-eps#progress-of-an-ep). - -This process ensures that larger changes can be discussed in advance and agreed -by the community, allowing the final review of the pull request to focus -on implementation. The result being that the merits of the concept can be -discussed at the appropriate level of abstraction without having to -review all of the technical details. - - -# Automation and Testing - -Automation and Testing are key strategies that go hand in hand in allowing -rapid change in a safe manner. - -Automation avoids error-prone manual steps. Once you have a task automated -the likelihood of errors is orders of magnitude smaller than doing those -tasks by hand, particularly when those tasks are done by different -individuals. - -One of our key tenets is to automate as much as we can. This ranges all -the way from the configuration of the machines in our build infrastructure -using Ansible, to automated jobs that build/sign/and release our binaries. - -Automated Testing allows tests to be run often enough to catch regressions -quickly and reliably. Given a good set of tests, we can make changes -confident that we'll catch changes which introduce regressions. - -There are many levels of testing and the strategy is to build our way -up the levels until we have as complete coverage as is reasonable. - -These levels include: - -* Functional Tests -* Platform/OS Coverage -* Dependency Testing -* Module Testing -* Stress Testing -* Development Workflows -* Use Case Testing - - -## Functional Tests - -Functional tests are the first level of defense. Our collaborator guidelines -require test cases for all new features added, and our collaborators set a -high standard in respect to requiring tests. - -It is not enough to simply have tests, those tests must be effective at -exercising the runtime. We measure code coverage nightly and publish -the results at [coverage.nodejs.org](https://coverage.nodejs.org/). -This allows us to ensure our tests remain effective and provides the data -necessary to further improve our tests. - -You'll also notice that there has been a lot of effort put into making sure -the tests pass reliably and consistently. If you watch the continuous -integration (ci) runs you will see that they are mostly green -and intermittent failures are rare. - -## Platform/OS Coverage - -This is not a type of test by itself. But by applying the strategy of -running tests across a broad range of platforms and OS types and levels it -multiplies the effectiveness of the existing tests. - -Issues which surface on a particular platform or OS often are not specific -to that platform or OS but instead are uncovered because of different timing, -default configuration or general environment. They could have occurred on any -of the other platforms. - -Our strategy is to test on a broad range of platforms both to ensure Node.js -works on our supported platforms, but also to leverage the diversity to -uncover as many problems as early as possible. - -## Dependency Testing - -Node.js has a number of key dependencies. It's important that we ensure -that any changes we apply to those dependencies don't have a negative effect. - -To this end we have a job which runs the v8 tests on the v8 tree within -the Node.js repo. This job runs nightly and on request for PRs that are -making changes to the v8 tree. 
- -We don't currently run the tests for other dependencies, but the delta in -the Node.js tree for the dependencies other than v8 is more limited. - -## Module Tests - -Module tests are the next level of defense. They help to validate that -changes are not going to break for end users. Most applications use -a number of modules, and many of the most popular modules are extensively -used. Any changes that impact those modules would have a -significant community impact. - -Our strategy is to run the module's own unit tests on a set of key modules -and to run these as often as possible. Currently they are run for -Current and LTS releases and we are working to increase that frequency. - -You can read more about our module testing efforts in -[https://github.com/nodejs/citgm/](https://github.com/nodejs/citgm/). - -## Stress Tests - -Some problems only surface after running for a long time. Stress tests help -to flush those out by running certain scearios over a prolonged period -of time. - -We don't have any stress tests running at this point but it will be our next -priority after we have module testing running at an appropriate frequency. - -## Development Workflows - -Development Workflows is another level up from Module Testing. It aims -to test common development workflows to ensure changes will not introduce -any regressions to those flows. - -These are more work to put in place and run but they will be next on our -list after getting stress tests in place. - -## Use Case Testing - -This would be the next logical step after Development Workflows, testing -for the common use cases for Node.js. - -Our current strategy is to get some of this coverage through the -benchmarking that we put in place, but it is another area we can work -on once we have the other levels of testing in place. - - - -# Performance Benchmarks - -While ensuring functional stability is good, its not enough. We also need -to make sure that performance is not degraded as changes flow in. - -Our strategy is to define the common use cases for Node.js and then -build up a set of benchmarks that we run and publish results for on a -regular basis. This work is ongoing in the -[Benchmarking Working Group](https://github.com/nodejs/benchmarking), -but we already have a number of key benchmarks being run nightly -across the major Node.js versions. You can view this data at: - -[https://benchmarking.nodejs.org/](https://benchmarking.nodejs.org/). - -This data allows us to ensure we avoid performance regressions as -changes flow in. - -# In Summary - -This may have been a bit of a long read but I hope it has put a number -of the activities you may have seen in the Node.js community over the last -year into context. If you ever wondered "Why are they doing that?", the answer is: - - **Node.js - Quality with Speed** diff --git a/locale/fa/blog/community/transitions.md b/locale/fa/blog/community/transitions.md deleted file mode 100644 index a6803725bc2f..000000000000 --- a/locale/fa/blog/community/transitions.md +++ /dev/null @@ -1,41 +0,0 @@ ---- -title: Transitions -author: Scott Hammond -date: 2015-05-08T18:00:00.000Z -status: publish -category: Community -slug: transitions -layout: blog-post.hbs ---- - -In February, we announced the [Node.js -Foundation](https://www.joyent.com/blog/introducing-the-nodejs-foundation), -which will steward Node.js moving forward and open its future up to the -community in a fashion that has not been available before. 
Organizations like -IBM, SAP, Apigee, F5, Fidelity, Microsoft, PayPal, Red Hat, and others are -sponsoring the Foundation, and they’re adding more contributors to the project. -The mission of the Foundation is to accelerate the adoption of Node and ensure -that the project is driven by the community under a transparent, open governance -model. - -Under the aegis of the Foundation, the Node.js project is entering the next -phase of maturity and adopting a model in which there is no BD or project lead. -Instead, the technical direction of the project will be established by a -technical steering committee run with an open governance model. There has been a -lot of discussion on the dev policies and [governance -model](https://github.com/joyent/nodejs-advisory-board/tree/master/governance-proposal) -on Github. As we move toward the Foundation model, the core team on Node.js is -already adopting some of these policies [as shown -here](https://github.com/joyent/node-website/pull/111). - -As we open a new chapter with the Foundation, we also close a remarkable chapter -in Node.js, as TJ Fontaine will be stepping back from his post as Node.js -Project Lead. TJ has come to be an integral member of our team, and his -contributions will have long-lasting effects on the future of Node.js. Although -he will not be as active, TJ will continue to act as a resource for helping the -Node.js project as needed. - -I would like to thank TJ for his time and contributions to Node.js and to -Joyent. I have witnessed firsthand the kind of impact he can have on a team, and -his technical chops will be missed. As we take this next major step in the -growth of Node.js, we wish TJ luck in his future endeavors. diff --git a/locale/fa/blog/community/update-v8-5.4.md b/locale/fa/blog/community/update-v8-5.4.md deleted file mode 100644 index 0769a68ed166..000000000000 --- a/locale/fa/blog/community/update-v8-5.4.md +++ /dev/null @@ -1,85 +0,0 @@ ---- -title: Node.js v7 has updated V8 to 5.4 -date: 2016-12-03T14:41:04.442Z -status: publish -category: Announcements -slug: node-v7-v8-54 -layout: blog-post.hbs -author: Michaël Zasso ---- - -With the release of Node.js 7.0.0, the V8 JavaScript engine has been upgraded from 5.1 -to its latest stable version, 5.4. -It brings in new language features and increased performance. - -## New ECMAScript features - -### Exponentiation operator (ES2016) - -* [Proposal](https://github.com/rwaldron/exponentiation-operator) -* [Spec](https://www.ecma-international.org/ecma-262/7.0/#sec-exp-operator) - -The `**` operator can now be used to raise the left-hand side to the power of the right-hand side. Example: - -```javascript -const maxInt = 2**32 - 1; // Equivalent to: Math.pow(2, 32) - 1 -``` - -### Object.values / Object.entries (ES2017) - -* [Proposal](https://github.com/tc39/proposal-object-values-entries) -* [Spec (draft)](https://tc39.github.io/ecma262/#sec-object.values) - -Complementing `Object.keys`, those two new static methods return respectively an Array of enumerable own property values -or entries (an entry being an array with two elements: key and value). 
Example: - -```javascript -const obj = { - x: 0, - y: 100 -}; - -const keys = Object.keys(obj); // ['x', 'y'] -const values = Object.values(obj); // [0, 100] -const entries = Object.entries(obj); // [['x', 0], ['y', 100]] -``` - -### Object.getOwnPropertyDescriptors (ES2017) - -* [Proposal](https://github.com/tc39/proposal-object-getownpropertydescriptors) -* [Spec (draft)](https://tc39.github.io/ecma262/#sec-object.getownpropertydescriptors) - -Returns all own property descriptors of an Object in a new Object, mapped by their respective key. Example: - -```javascript -const obj = { - x: 0, - y: 100 -}; - -const descriptors = Object.getOwnPropertyDescriptors(obj); -/* -{ x: {value: 0, writable: true, enumerable: true, configurable: true}, - y: {value: 100, writable: true, enumerable: true, configurable: true} } -*/ -``` - -## Performance and memory optimizations - -### From [V8 5.2](https://v8project.blogspot.ch/2016/06/release-52.html) - -Improvement of JavaScript built-ins, including: -* `Array` operations like the `isArray` method. -* The `in` operator. -* `Function.prototype.bind`. - -### From [V8 5.3](https://v8project.blogspot.ch/2016/07/v8-release-53.html) - -* The new Ignition interpreter is now feature complete and can be tested with the flag `--ignition`. Read the [blog post](https://v8project.blogspot.ch/2016/08/firing-up-ignition-interpreter.html) from V8's team for more information. -* The garbage collector has been improved and full garbage collection pause times can be reduced up to 25%. -* Improvement of ES6 Promise performance. - -### From [V8 5.4](https://v8project.blogspot.ch/2016/09/v8-release-54.html) - -* Reduced on-heap peak memory consumption on low-memory devices up to 40%. -* Optimizations in V8's parser allowed to reduce off-heap peak memory consumption up to 20% and improve startup performance. diff --git a/locale/fa/blog/community/v5-to-v7.md b/locale/fa/blog/community/v5-to-v7.md deleted file mode 100644 index 0287cb0f9250..000000000000 --- a/locale/fa/blog/community/v5-to-v7.md +++ /dev/null @@ -1,46 +0,0 @@ ---- -title: Farewell to Node.js v5, Preparing for v7 -date: 2016-09-06T23:36:16.645Z -status: publish -category: Annoucements -slug: v5-to-v7 -layout: blog-post.hbs -author: Rod Vagg ---- - -You may have missed it but at the end of June, the Node.js project said a final farewell to version 5. There will be no more patches, critical or otherwise, for this branch. To those who have been using Node.js for some time this may seem anomalous, shouldn't major versions stick around for _years_? - -## We have a plan! - -![LTS Schedule Summary](/static/images/blog/201609_lts_schedule_summary.gif) - -Last year, the core team devised a Long-term Support (LTS) and release plan to balance the various wants and needs expressed by Node.js users. Chief among those were: - -1. Stability -2. Progress - -The io.js diversion was useful for many reasons, including the opportunity we had to lean into this "progress" thing. We learned that there is a necessary trade-off between "stability" and the rapid iteration of the platform. Some of it was manageable but much was unavoidable. Breaking the entire C++ add-on ecosystem each time we upgraded V8 turned out to be quite painful for the Node.js package ecosystem. 
This is due to the heavy reliance on compiled native components in Node.js userland and the difficulty Node.js has had in maintaining [API](https://en.wikipedia.org/wiki/Application_programming_interface) and [ABI](https://en.wikipedia.org/wiki/Application_binary_interface) stability while upgrading V8. - -On the flip side, it was clear that v0.10 went on far too long and the slow downward trend in release frequency was hurting the platform's reputation for being innovative and _modern_ and was preventing iteration on the features and fixes that Node.js actually needed. This was one of the key reasons io.js even existed. - -So, all this experience and history put us in a position to try and formulate a plan for combining both stability and progress. We didn't just find a compromise, we found a way that these often competing goals could coexist. - -## Which brings me to Node.js v5. - -Every 6 months, we plan to release a new _major_ version of Node.js. The version is _major_ in the [semver](http://semver.org/) sense in that we hold back breaking changes on our master branch until the 6 month point where we can release them together in a batch. The creation of these new release lines occur during April and October each year. Even version numbers happen to come in the April release while odd version numbers are in the October release. - -Each major version of Node.js has an active life of 6 months in what we are now calling "Current". During this period we ship most of the active work that goes in to the Node.js codebase except for some items that we reserve for the next major release. Node.js version 5 was first released in October last year, so its "Current" period ended in April this year. At the end of this 6 month period, something different happens for odd and even versioned release lines. The even versions turn in to LTS and receive another 30 months of life; this happened for version 4 in October last year and will happen for version 6 in October this year. The odd versions, however, don't get this extended life. Instead, as a transitionary measure, we provide another 2 months of support where we'll ensure that important fixes make it into that release line. - -And this is exactly what happened to version 5. It lived as _Current_ for 6 months from October, 2015 to April, 2016 and then in a special Maintenance phase for another 2 months until June, 2016. At the end of June, we ceased supporting Node.js version 5 and it will no longer receive any fixes or updates from the core team _(although you're welcome to play with the `v5.x` branch on the [Node.js repository](https://github.com/nodejs/node) if it's important to you!)_ - -The core team is focusing all of its activities on the following release lines: - -* v0.10 which will receive occasional critical fixes during its current Maintenance phase and will cease to be supported in October this year. -* v0.12 which will receive occasional critical fixes during its current Maintenance phase and will cease to be supported in December this year. -* v4 which is in Active LTS and is receiving more regular patches and occasional important feature additions, this will continue until October 2017 where it will switch to Maintenance and operate in a manner similar to v0.10 and v0.12 until April 2018. -* v6 which is still a Current release, due to become our second LTS release in October where its life will continue under Active LTS and Maintenance until April 2019. 
-* v7 is being planned for a release in October this year at the same time that we switch v6 to LTS. You can already try out nightly builds from our `master` branch at but expect to see a focus on quality and stability of these in the coming months as we create a `v7.x` branch and becoming more choosy about what gets to make it in to v7.0.0. - -It sounds like a lot, but once we move beyond the legacy v0.12 and v0.10 release lines we expect the steady cadence of major versions and their various releases to become easier to understand. - -Armed with this knowledge, what's next for you? We suggest you make a judgement on the stability and quality requirements for your own use of Node.js and pick a release line that suits. For production deployments of Node.js we generally recommend version 4 where stability is taken very seriously. For everyday development, non-critical deployments and where Node.js is used as part of a toolchain (e.g. for building frontend components), a Current release should work just fine. We'd love your help testing nightly builds of the next major version of Node.js and while we do continuous unit testing and smoke testing of our `master` branch, we can't provide any assurances of stability or quality of these nightly builds, so buyer beware. diff --git a/locale/fa/blog/feature/index.md b/locale/fa/blog/feature/index.md deleted file mode 100644 index 12d3702b0946..000000000000 --- a/locale/fa/blog/feature/index.md +++ /dev/null @@ -1,6 +0,0 @@ ---- -title: Features -layout: category-index.hbs -listing: true -robots: noindex, follow ---- diff --git a/locale/fa/blog/feature/streams2.md b/locale/fa/blog/feature/streams2.md deleted file mode 100644 index b5cc6daf198a..000000000000 --- a/locale/fa/blog/feature/streams2.md +++ /dev/null @@ -1,854 +0,0 @@ ---- -title: A New Streaming API for Node v0.10 -author: Isaac Z. Schlueter -date: 2012-12-21T00:45:13.000Z -slug: streams2 -category: feature -layout: blog-post.hbs ---- - -**tl;dr** - -* Node streams are great, except for all the ways in which they're - terrible. -* A new Stream implementation is coming in 0.10, that has gotten the - nickname "streams2". -* Readable streams have a `read()` method that returns a buffer or - null. (More documentation included below.) -* `'data'` events, `pause()`, and `resume()` will still work as before - (except that they'll actully work how you'd expect). -* Old programs will **almost always** work without modification, but - streams start out in a paused state, and need to be read from to be - consumed. -* **WARNING**: If you never add a `'data'` event handler, or call - `resume()`, then it'll sit in a paused state forever and never - emit `'end'`. - -------- - -Throughout the life of Node, we've been gradually iterating on the -ideal event-based API for handling data. Over time, this developed -into the "Stream" interface that you see throughout Node's core -modules and many of the modules in npm. - -Consistent interfaces increase the portability and reliability of our -programs and libraries. Overall, the move from domain-specific events -and methods towards a unified stream interface was a huge win. -However, there are still several problems with Node's streams as of -v0.8. In a nutshell: - -1. The `pause()` method doesn't pause. It is advisory-only. In - Node's implementation, this makes things much simpler, but it's - confusing to users, and doesn't do what it looks like it does. -2. `'data'` events come right away (whether you're ready or not). 
- This makes it unreasonably difficult to do common tasks like load a - user's session before deciding how to handle their request. -3. There is no way to consume a specific number of bytes, and then - leave the rest for some other part of the program to deal with. -4. It's unreasonably difficult to implement streams and get all the - intricacies of pause, resume, write-buffering, and data events - correct. The lack of shared classes mean that we all have to solve - the same problems repeatedly, making similar mistakes and similar - bugs. - -Common simple tasks should be easy, or we aren't doing our job. -People often say that Node is better than most other platforms at this -stuff, but in my opinion, that is less of a compliment and more of an -indictment of the current state of software. Being better than the -next guy isn't enough; we have to be the best imaginable. While they -were a big step in the right direction, the Streams in Node up until -now leave a lot wanting. - -So, just fix it, right? - -Well, we are sitting on the results of several years of explosive -growth in the Node community, so any changes have to be made very -carefully. If we break all the Node programs in 0.10, then no one -will ever want to upgrade to 0.10, and it's all pointless. We had -this conversation around 0.4, then again around 0.6, then again around -0.8. Every time, the conclusion has been "Too much work, too hard to -make backwards-compatible", and we always had more pressing problems -to solve. - -In 0.10, we cannot put it off any longer. We've bitten the bullet and -are making a significant change to the Stream implementation. You may -have seen conversations on twitter or IRC or the mailing list about -"streams2". I also gave [a talk in -November](https://dl.dropbox.com/u/3685/presentations/streams2/streams2-ko.pdf) -about this subject. A lot of node module authors have been involved -with the development of streams2 (and of course the node core team). - -## streams2 - -The feature is described pretty thoroughly in the documentation, so -I'm including it below. Please read it, especially the section on -"compatibility". There's a caveat there that is unfortunately -unavoidable, but hopefully enough of an edge case that it's easily -worked around. - -The first preview release with this change will be 0.9.4. I highly -recommend trying this release and providing feedback before it lands -in a stable version. - -As of writing this post, there are some known performance regressions, -especially in the http module. We are fanatical about maintaining -performance in Node.js, so of course this will have to be fixed before -the v0.10 stable release. (Watch for a future blog post on the tools -and techniques that have been useful in tracking down these issues.) - -There may be minor changes as necessary to fix bugs and improve -performance, but the API at this point should be considered feature -complete. It correctly does all the things we need it to do, it just -doesn't do them quite well enough yet. As always, be wary of running -unstable releases in production, of course, but I encourage you to try -it out and see what you think. Especially, if you have tests that you -can run on your modules and libraries, that would be extremely useful -feedback. - --------- - -# Stream - - Stability: 2 - Unstable - -A stream is an abstract interface implemented by various objects in -Node. For example a request to an HTTP server is a stream, as is -stdout. Streams are readable, writable, or both. 
All streams are -instances of [EventEmitter][] - -You can load the Stream base classes by doing `require('stream')`. -There are base classes provided for Readable streams, Writable -streams, Duplex streams, and Transform streams. - -## Compatibility - -In earlier versions of Node, the Readable stream interface was -simpler, but also less powerful and less useful. - -* Rather than waiting for you to call the `read()` method, `'data'` - events would start emitting immediately. If you needed to do some - I/O to decide how to handle data, then you had to store the chunks - in some kind of buffer so that they would not be lost. -* The `pause()` method was advisory, rather than guaranteed. This - meant that you still had to be prepared to receive `'data'` events - even when the stream was in a paused state. - -In Node v0.10, the Readable class described below was added. For -backwards compatibility with older Node programs, Readable streams -switch into "old mode" when a `'data'` event handler is added, or when -the `pause()` or `resume()` methods are called. The effect is that, -even if you are not using the new `read()` method and `'readable'` -event, you no longer have to worry about losing `'data'` chunks. - -Most programs will continue to function normally. However, this -introduces an edge case in the following conditions: - -* No `'data'` event handler is added. -* The `pause()` and `resume()` methods are never called. - -For example, consider the following code: - -```javascript -// WARNING! BROKEN! -net.createServer(function(socket) { - - // we add an 'end' method, but never consume the data - socket.on('end', function() { - // It will never get here. - socket.end('I got your message (but didnt read it)\n'); - }); - -}).listen(1337); -``` - -In versions of node prior to v0.10, the incoming message data would be -simply discarded. However, in Node v0.10 and beyond, the socket will -remain paused forever. - -The workaround in this situation is to call the `resume()` method to -trigger "old mode" behavior: - -```javascript -// Workaround -net.createServer(function(socket) { - - socket.on('end', function() { - socket.end('I got your message (but didnt read it)\n'); - }); - - // start the flow of data, discarding it. - socket.resume(); - -}).listen(1337); -``` - -In addition to new Readable streams switching into old-mode, pre-v0.10 -style streams can be wrapped in a Readable class using the `wrap()` -method. - -## Class: stream.Readable - - - -A `Readable Stream` has the following methods, members, and events. - -Note that `stream.Readable` is an abstract class designed to be -extended with an underlying implementation of the `_read(size)` -method. (See below.) - -### new stream.Readable([options]) - -* `options` {Object} - * `highWaterMark` {Number} The maximum number of bytes to store in - the internal buffer before ceasing to read from the underlying - resource. Default=16kb - * `encoding` {String} If specified, then buffers will be decoded to - strings using the specified encoding. Default=null - * `objectMode` {Boolean} Whether this stream should behave - as a stream of objects. Meaning that stream.read(n) returns - a single value instead of a Buffer of size n - -In classes that extend the Readable class, make sure to call the -constructor so that the buffering settings can be properly -initialized. 
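
As a quick orientation before the reference sections below, here is a minimal sketch of a Readable subclass. This is an editorial illustration rather than part of the original reference text: the `Counter` class and the chunks it produces are made up, but it uses only the constructor, `_read()`, `push()`, and `read()` behavior documented in this section.

```javascript
var Readable = require('stream').Readable;
var util = require('util');

// A tiny Readable that produces the numbers 1..5 and then ends.
function Counter(options) {
  Readable.call(this, options); // initialize the buffering settings
  this._index = 0;
  this._max = 5;
}
util.inherits(Counter, Readable);

Counter.prototype._read = function(size) {
  this._index++;
  if (this._index > this._max)
    this.push(null);                 // no more data: signal EOF
  else
    this.push(String(this._index));  // queue a chunk for the consumer
};

var counter = new Counter();
counter.on('readable', function() {
  var chunk;
  while (null !== (chunk = counter.read()))
    console.log('got: %s', chunk.toString());
});
```
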
- -### readable.\_read(size) - -* `size` {Number} Number of bytes to read asynchronously - -Note: **This function should NOT be called directly.** It should be -implemented by child classes, and called by the internal Readable -class methods only. - -All Readable stream implementations must provide a `_read` method -to fetch data from the underlying resource. - -This method is prefixed with an underscore because it is internal to -the class that defines it, and should not be called directly by user -programs. However, you **are** expected to override this method in -your own extension classes. - -When data is available, put it into the read queue by calling -`readable.push(chunk)`. If `push` returns false, then you should stop -reading. When `_read` is called again, you should start pushing more -data. - -The `size` argument is advisory. Implementations where a "read" is a -single call that returns data can use this to know how much data to -fetch. Implementations where that is not relevant, such as TCP or -TLS, may ignore this argument, and simply provide data whenever it -becomes available. There is no need, for example to "wait" until -`size` bytes are available before calling `stream.push(chunk)`. - -### readable.push(chunk) - -* `chunk` {Buffer | null | String} Chunk of data to push into the read queue -* return {Boolean} Whether or not more pushes should be performed - -Note: **This function should be called by Readable implementors, NOT -by consumers of Readable subclasses.** The `_read()` function will not -be called again until at least one `push(chunk)` call is made. If no -data is available, then you MAY call `push('')` (an empty string) to -allow a future `_read` call, without adding any data to the queue. - -The `Readable` class works by putting data into a read queue to be -pulled out later by calling the `read()` method when the `'readable'` -event fires. - -The `push()` method will explicitly insert some data into the read -queue. If it is called with `null` then it will signal the end of the -data. - -In some cases, you may be wrapping a lower-level source which has some -sort of pause/resume mechanism, and a data callback. In those cases, -you could wrap the low-level source object by doing something like -this: - -```javascript -// source is an object with readStop() and readStart() methods, -// and an `ondata` member that gets called when it has data, and -// an `onend` member that gets called when the data is over. - -var stream = new Readable(); - -source.ondata = function(chunk) { - // if push() returns false, then we need to stop reading from source - if (!stream.push(chunk)) - source.readStop(); -}; - -source.onend = function() { - stream.push(null); -}; - -// _read will be called when the stream wants to pull more data in -// the advisory size argument is ignored in this case. -stream._read = function(n) { - source.readStart(); -}; -``` - -### readable.unshift(chunk) - -* `chunk` {Buffer | null | String} Chunk of data to unshift onto the read queue -* return {Boolean} Whether or not more pushes should be performed - -This is the corollary of `readable.push(chunk)`. Rather than putting -the data at the *end* of the read queue, it puts it at the *front* of -the read queue. - -This is useful in certain use-cases where a stream is being consumed -by a parser, which needs to "un-consume" some data that it has -optimistically pulled out of the source. - -```javascript -// A parser for a simple data protocol. 
-// The "header" is a JSON object, followed by 2 \n characters, and -// then a message body. -// -// Note: This can be done more simply as a Transform stream. See below. - -function SimpleProtocol(source, options) { - if (!(this instanceof SimpleProtocol)) - return new SimpleProtocol(options); - - Readable.call(this, options); - this._inBody = false; - this._sawFirstCr = false; - - // source is a readable stream, such as a socket or file - this._source = source; - - var self = this; - source.on('end', function() { - self.push(null); - }); - - // give it a kick whenever the source is readable - // read(0) will not consume any bytes - source.on('readable', function() { - self.read(0); - }); - - this._rawHeader = []; - this.header = null; -} - -SimpleProtocol.prototype = Object.create( - Readable.prototype, { constructor: { value: SimpleProtocol }}); - -SimpleProtocol.prototype._read = function(n) { - if (!this._inBody) { - var chunk = this._source.read(); - - // if the source doesn't have data, we don't have data yet. - if (chunk === null) - return this.push(''); - - // check if the chunk has a \n\n - var split = -1; - for (var i = 0; i < chunk.length; i++) { - if (chunk[i] === 10) { // '\n' - if (this._sawFirstCr) { - split = i; - break; - } else { - this._sawFirstCr = true; - } - } else { - this._sawFirstCr = false; - } - } - - if (split === -1) { - // still waiting for the \n\n - // stash the chunk, and try again. - this._rawHeader.push(chunk); - this.push(''); - } else { - this._inBody = true; - var h = chunk.slice(0, split); - this._rawHeader.push(h); - var header = Buffer.concat(this._rawHeader).toString(); - try { - this.header = JSON.parse(header); - } catch (er) { - this.emit('error', new Error('invalid simple protocol data')); - return; - } - // now, because we got some extra data, unshift the rest - // back into the read queue so that our consumer will see it. - var b = chunk.slice(split); - this.unshift(b); - - // and let them know that we are done parsing the header. - this.emit('header', this.header); - } - } else { - // from there on, just provide the data to our consumer. - // careful not to push(null), since that would indicate EOF. - var chunk = this._source.read(); - if (chunk) this.push(chunk); - } -}; - -// Usage: -var parser = new SimpleProtocol(source); -// Now parser is a readable stream that will emit 'header' -// with the parsed header data. -``` - -### readable.wrap(stream) - -* `stream` {Stream} An "old style" readable stream - -If you are using an older Node library that emits `'data'` events and -has a `pause()` method that is advisory only, then you can use the -`wrap()` method to create a Readable stream that uses the old stream -as its data source. - -For example: - -```javascript -var OldReader = require('./old-api-module.js').OldReader; -var oreader = new OldReader; -var Readable = require('stream').Readable; -var myReader = new Readable().wrap(oreader); - -myReader.on('readable', function() { - myReader.read(); // etc. -}); -``` - -### Event: 'readable' - -When there is data ready to be consumed, this event will fire. - -When this event emits, call the `read()` method to consume the data. - -### Event: 'end' - -Emitted when the stream has received an EOF (FIN in TCP terminology). -Indicates that no more `'data'` events will happen. If the stream is -also writable, it may be possible to continue writing. - -### Event: 'data' - -The `'data'` event emits either a `Buffer` (by default) or a string if -`setEncoding()` was used. 
- -Note that adding a `'data'` event listener will switch the Readable -stream into "old mode", where data is emitted as soon as it is -available, rather than waiting for you to call `read()` to consume it. - -### Event: 'error' - -Emitted if there was an error receiving data. - -### Event: 'close' - -Emitted when the underlying resource (for example, the backing file -descriptor) has been closed. Not all streams will emit this. - -### readable.setEncoding(encoding) - -Makes the `'data'` event emit a string instead of a `Buffer`. `encoding` -can be `'utf8'`, `'utf16le'` (`'ucs2'`), `'ascii'`, or `'hex'`. - -The encoding can also be set by specifying an `encoding` field to the -constructor. - -### readable.read([size]) - -* `size` {Number | null} Optional number of bytes to read. -* Return: {Buffer | String | null} - -Note: **This function SHOULD be called by Readable stream users.** - -Call this method to consume data once the `'readable'` event is -emitted. - -The `size` argument will set a minimum number of bytes that you are -interested in. If not set, then the entire content of the internal -buffer is returned. - -If there is no data to consume, or if there are fewer bytes in the -internal buffer than the `size` argument, then `null` is returned, and -a future `'readable'` event will be emitted when more is available. - -Calling `stream.read(0)` will always return `null`, and will trigger a -refresh of the internal buffer, but otherwise be a no-op. - -### readable.pipe(destination, [options]) - -* `destination` {Writable Stream} -* `options` {Object} Optional - * `end` {Boolean} Default=true - -Connects this readable stream to `destination` WriteStream. Incoming -data on this stream gets written to `destination`. Properly manages -back-pressure so that a slow destination will not be overwhelmed by a -fast readable stream. - -This function returns the `destination` stream. - -For example, emulating the Unix `cat` command: - - process.stdin.pipe(process.stdout); - -By default `end()` is called on the destination when the source stream -emits `end`, so that `destination` is no longer writable. Pass `{ end: -false }` as `options` to keep the destination stream open. - -This keeps `writer` open so that "Goodbye" can be written at the -end. - - reader.pipe(writer, { end: false }); - reader.on("end", function() { - writer.end("Goodbye\n"); - }); - -Note that `process.stderr` and `process.stdout` are never closed until -the process exits, regardless of the specified options. - -### readable.unpipe([destination]) - -* `destination` {Writable Stream} Optional - -Undo a previously established `pipe()`. If no destination is -provided, then all previously established pipes are removed. - -### readable.pause() - -Switches the readable stream into "old mode", where data is emitted -using a `'data'` event rather than being buffered for consumption via -the `read()` method. - -Ceases the flow of data. No `'data'` events are emitted while the -stream is in a paused state. - -### readable.resume() - -Switches the readable stream into "old mode", where data is emitted -using a `'data'` event rather than being buffered for consumption via -the `read()` method. - -Resumes the incoming `'data'` events after a `pause()`. - - -## Class: stream.Writable - - - -A `Writable` Stream has the following methods, members, and events. - -Note that `stream.Writable` is an abstract class designed to be -extended with an underlying implementation of the -`_write(chunk, encoding, cb)` method. (See below.) 
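
Before the formal reference below, here is a minimal sketch of a Writable subclass. Again, this is an illustrative example rather than part of the original text; the `Shouter` class and its behavior are made up, but the pattern (call the constructor, implement `_write()`, invoke the callback) is the one documented in the following sections.

```javascript
var Writable = require('stream').Writable;
var util = require('util');

// A tiny Writable that upper-cases whatever is written to it.
function Shouter(options) {
  Writable.call(this, options); // initialize the buffering settings
}
util.inherits(Shouter, Writable);

Shouter.prototype._write = function(chunk, encoding, callback) {
  // chunk is a Buffer unless decodeStrings: false was passed.
  console.log(chunk.toString().toUpperCase());
  callback(); // tell the stream this chunk has been handled
};

var shouter = new Shouter();
shouter.write('hello ');
shouter.end('world');
```
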
- -### new stream.Writable([options]) - -* `options` {Object} - * `highWaterMark` {Number} Buffer level when `write()` starts - returning false. Default=16kb - * `decodeStrings` {Boolean} Whether or not to decode strings into - Buffers before passing them to `_write()`. Default=true - -In classes that extend the Writable class, make sure to call the -constructor so that the buffering settings can be properly -initialized. - -### writable.\_write(chunk, encoding, callback) - -* `chunk` {Buffer | String} The chunk to be written. Will always - be a buffer unless the `decodeStrings` option was set to `false`. -* `encoding` {String} If the chunk is a string, then this is the - encoding type. Ignore chunk is a buffer. Note that chunk will - **always** be a buffer unless the `decodeStrings` option is - explicitly set to `false`. -* `callback` {Function} Call this function (optionally with an error - argument) when you are done processing the supplied chunk. - -All Writable stream implementations must provide a `_write` method to -send data to the underlying resource. - -Note: **This function MUST NOT be called directly.** It should be -implemented by child classes, and called by the internal Writable -class methods only. - -Call the callback using the standard `callback(error)` pattern to -signal that the write completed successfully or with an error. - -If the `decodeStrings` flag is set in the constructor options, then -`chunk` may be a string rather than a Buffer, and `encoding` will -indicate the sort of string that it is. This is to support -implementations that have an optimized handling for certain string -data encodings. If you do not explicitly set the `decodeStrings` -option to `false`, then you can safely ignore the `encoding` argument, -and assume that `chunk` will always be a Buffer. - -This method is prefixed with an underscore because it is internal to -the class that defines it, and should not be called directly by user -programs. However, you **are** expected to override this method in -your own extension classes. - - -### writable.write(chunk, [encoding], [callback]) - -* `chunk` {Buffer | String} Data to be written -* `encoding` {String} Optional. If `chunk` is a string, then encoding - defaults to `'utf8'` -* `callback` {Function} Optional. Called when this chunk is - successfully written. -* Returns {Boolean} - -Writes `chunk` to the stream. Returns `true` if the data has been -flushed to the underlying resource. Returns `false` to indicate that -the buffer is full, and the data will be sent out in the future. The -`'drain'` event will indicate when the buffer is empty again. - -The specifics of when `write()` will return false, is determined by -the `highWaterMark` option provided to the constructor. - -### writable.end([chunk], [encoding], [callback]) - -* `chunk` {Buffer | String} Optional final data to be written -* `encoding` {String} Optional. If `chunk` is a string, then encoding - defaults to `'utf8'` -* `callback` {Function} Optional. Called when the final chunk is - successfully written. - -Call this method to signal the end of the data being written to the -stream. - -### Event: 'drain' - -Emitted when the stream's write queue empties and it's safe to write -without buffering again. Listen for it when `stream.write()` returns -`false`. - -### Event: 'close' - -Emitted when the underlying resource (for example, the backing file -descriptor) has been closed. Not all streams will emit this. 
- -### Event: 'finish' - -When `end()` is called and there are no more chunks to write, this -event is emitted. - -### Event: 'pipe' - -* `source` {Readable Stream} - -Emitted when the stream is passed to a readable stream's pipe method. - -### Event 'unpipe' - -* `source` {Readable Stream} - -Emitted when a previously established `pipe()` is removed using the -source Readable stream's `unpipe()` method. - -## Class: stream.Duplex - - - -A "duplex" stream is one that is both Readable and Writable, such as a -TCP socket connection. - -Note that `stream.Duplex` is an abstract class designed to be -extended with an underlying implementation of the `_read(size)` -and `_write(chunk, encoding, callback)` methods as you would with a Readable or -Writable stream class. - -Since JavaScript doesn't have multiple prototypal inheritance, this -class prototypally inherits from Readable, and then parasitically from -Writable. It is thus up to the user to implement both the lowlevel -`_read(n)` method as well as the lowlevel `_write(chunk, encoding, cb)` method -on extension duplex classes. - -### new stream.Duplex(options) - -* `options` {Object} Passed to both Writable and Readable - constructors. Also has the following fields: - * `allowHalfOpen` {Boolean} Default=true. If set to `false`, then - the stream will automatically end the readable side when the - writable side ends and vice versa. - -In classes that extend the Duplex class, make sure to call the -constructor so that the buffering settings can be properly -initialized. - -## Class: stream.Transform - -A "transform" stream is a duplex stream where the output is causally -connected in some way to the input, such as a zlib stream or a crypto -stream. - -There is no requirement that the output be the same size as the input, -the same number of chunks, or arrive at the same time. For example, a -Hash stream will only ever have a single chunk of output which is -provided when the input is ended. A zlib stream will either produce -much smaller or much larger than its input. - -Rather than implement the `_read()` and `_write()` methods, Transform -classes must implement the `_transform()` method, and may optionally -also implement the `_flush()` method. (See below.) - -### new stream.Transform([options]) - -* `options` {Object} Passed to both Writable and Readable - constructors. - -In classes that extend the Transform class, make sure to call the -constructor so that the buffering settings can be properly -initialized. - -### transform.\_transform(chunk, encoding, callback) - -* `chunk` {Buffer | String} The chunk to be transformed. Will always - be a buffer unless the `decodeStrings` option was set to `false`. -* `encoding` {String} If the chunk is a string, then this is the - encoding type. (Ignore if `decodeStrings` chunk is a buffer.) -* `callback` {Function} Call this function (optionally with an error - argument) when you are done processing the supplied chunk. - -Note: **This function MUST NOT be called directly.** It should be -implemented by child classes, and called by the internal Transform -class methods only. - -All Transform stream implementations must provide a `_transform` -method to accept input and produce output. - -`_transform` should do whatever has to be done in this specific -Transform class, to handle the bytes being written, and pass them off -to the readable portion of the interface. Do asynchronous I/O, -process things, and so on. 
- -Call `transform.push(outputChunk)` 0 or more times to generate output -from this input chunk, depending on how much data you want to output -as a result of this chunk. - -Call the callback function only when the current chunk is completely -consumed. Note that there may or may not be output as a result of any -particular input chunk. - -This method is prefixed with an underscore because it is internal to -the class that defines it, and should not be called directly by user -programs. However, you **are** expected to override this method in -your own extension classes. - -### transform.\_flush(callback) - -* `callback` {Function} Call this function (optionally with an error - argument) when you are done flushing any remaining data. - -Note: **This function MUST NOT be called directly.** It MAY be implemented -by child classes, and if so, will be called by the internal Transform -class methods only. - -In some cases, your transform operation may need to emit a bit more -data at the end of the stream. For example, a `Zlib` compression -stream will store up some internal state so that it can optimally -compress the output. At the end, however, it needs to do the best it -can with what is left, so that the data will be complete. - -In those cases, you can implement a `_flush` method, which will be -called at the very end, after all the written data is consumed, but -before emitting `end` to signal the end of the readable side. Just -like with `_transform`, call `transform.push(chunk)` zero or more -times, as appropriate, and call `callback` when the flush operation is -complete. - -This method is prefixed with an underscore because it is internal to -the class that defines it, and should not be called directly by user -programs. However, you **are** expected to override this method in -your own extension classes. - -### Example: `SimpleProtocol` parser - -The example above of a simple protocol parser can be implemented much -more simply by using the higher level `Transform` stream class. - -In this example, rather than providing the input as an argument, it -would be piped into the parser, which is a more idiomatic Node stream -approach. - -```javascript -function SimpleProtocol(options) { - if (!(this instanceof SimpleProtocol)) - return new SimpleProtocol(options); - - Transform.call(this, options); - this._inBody = false; - this._sawFirstCr = false; - this._rawHeader = []; - this.header = null; -} - -SimpleProtocol.prototype = Object.create( - Transform.prototype, { constructor: { value: SimpleProtocol }}); - -SimpleProtocol.prototype._transform = function(chunk, encoding, done) { - if (!this._inBody) { - // check if the chunk has a \n\n - var split = -1; - for (var i = 0; i < chunk.length; i++) { - if (chunk[i] === 10) { // '\n' - if (this._sawFirstCr) { - split = i; - break; - } else { - this._sawFirstCr = true; - } - } else { - this._sawFirstCr = false; - } - } - - if (split === -1) { - // still waiting for the \n\n - // stash the chunk, and try again. - this._rawHeader.push(chunk); - } else { - this._inBody = true; - var h = chunk.slice(0, split); - this._rawHeader.push(h); - var header = Buffer.concat(this._rawHeader).toString(); - try { - this.header = JSON.parse(header); - } catch (er) { - this.emit('error', new Error('invalid simple protocol data')); - return; - } - // and let them know that we are done parsing the header. - this.emit('header', this.header); - - // now, because we got some extra data, emit this first. 
- this.push(b); - } - } else { - // from there on, just provide the data to our consumer as-is. - this.push(b); - } - done(); -}; - -var parser = new SimpleProtocol(); -source.pipe(parser) - -// Now parser is a readable stream that will emit 'header' -// with the parsed header data. -``` - - -## Class: stream.PassThrough - -This is a trivial implementation of a `Transform` stream that simply -passes the input bytes across to the output. Its purpose is mainly -for examples and testing, but there are occasionally use cases where -it can come in handy. - - -[EventEmitter]: https://nodejs.org/api/events.html#events_class_eventemitter diff --git a/locale/fa/blog/module/index.md b/locale/fa/blog/module/index.md deleted file mode 100644 index 613857450ea6..000000000000 --- a/locale/fa/blog/module/index.md +++ /dev/null @@ -1,6 +0,0 @@ ---- -title: Modules -layout: category-index.hbs -listing: true -robots: noindex, follow ---- diff --git a/locale/fa/blog/module/multi-server-continuous-deployment-with-fleet.md b/locale/fa/blog/module/multi-server-continuous-deployment-with-fleet.md deleted file mode 100644 index 5614bc10a884..000000000000 --- a/locale/fa/blog/module/multi-server-continuous-deployment-with-fleet.md +++ /dev/null @@ -1,92 +0,0 @@ ---- -title: multi-server continuous deployment with fleet -author: Isaac Schlueter -date: 2012-05-02T18:00:00.000Z -status: publish -category: module -slug: multi-server-continuous-deployment-with-fleet -layout: blog-post.hbs ---- - -

This is a guest post by James "SubStack" Halliday, originally posted on his blog, and reposted here with permission.

- -

Writing applications as a sequence of tiny services that all talk to each other over the network has many upsides, but it can be annoyingly tedious to get all the subsystems up and running.

- -

Running a seaport can help with getting all the services to talk to each other, but running the processes is another matter, especially when you have new code to push into production.

- -

fleet aims to make it really easy for anyone on your team to push new code from git to an armada of servers and manage all the processes in your stack.

- -

To start using fleet, just install the fleet command with npm:

- -
npm install -g fleet 
- -

Then on one of your servers, start a fleet hub. From a fresh directory, give it a passphrase and a port to listen on:

- -
fleet hub --port=7000 --secret=beepboop 
- -

Now fleet is listening on :7000 for commands and has started a git server on :7001 over http. There are no ssh keys or post-commit hooks to configure; just run that command and you're ready to go!

- -

Next, set up some worker drones to run your processes. You can have as many workers as you like on a single server, but each worker should be run from a separate directory. Just do:

- -
fleet drone --hub=x.x.x.x:7000 --secret=beepboop 
- -

where x.x.x.x is the address where the fleet hub is running. Spin up a few of these drones.

- -

Now navigate to the directory of the app you want to deploy. First set a remote so you don't need to type --hub and --secret all the time.

- -
fleet remote add default --hub=x.x.x.x:7000 --secret=beepboop 
- -

Fleet just created a fleet.json file for you to save your settings.

- -

From the same app directory, to deploy your code just do:

- -
fleet deploy 
- -

The deploy command does a git push to the fleet hub's git http server and then the hub instructs all the drones to pull from it. Your code gets checked out into a new directory on all the fleet drones every time you deploy.

- -

Because fleet is designed specifically for managing applications with lots of tiny services, the deploy command isn't tied to running any processes. Starting processes is up to the programmer but it's super simple. Just use the fleet spawn command:

- -
fleet spawn -- node server.js 8080 
- -

By default fleet picks a drone at random to run the process on. You can specify which drone you want to run a particular process on with the --drone switch if it matters.

- -

Start a few processes across all your worker drones and then show what is running with the fleet ps command:

- -
fleet ps
-drone#3dfe17b8
-├─┬ pid#1e99f4
-│ ├── status:   running
-│ ├── commit:   webapp/1b8050fcaf8f1b02b9175fcb422644cb67dc8cc5
-│ └── command:  node server.js 8888
-└─┬ pid#d7048a
-  ├── status:   running
-  ├── commit:   webapp/1b8050fcaf8f1b02b9175fcb422644cb67dc8cc5
-  └── command:  node server.js 8889
- -

Now suppose that you have new code to push out into production. By default, fleet lets you spin up new services without disturbing your existing services. If you fleet deploy again after checking in some new changes to git, the next time you fleet spawn a new process, that process will be spun up in a completely new directory based on the git commit hash. To stop a process, just use fleet stop.

- -

This approach lets you verify that the new services work before bringing down the old services. You can even start experimenting with heterogeneous and incremental deployment by hooking into a custom http proxy!
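
To make that concrete, here is a very rough sketch of such a proxy using only node's core http module. The port numbers are simply the ones shown in the fleet ps output above; a real deployment would add error handling and might prefer a dedicated proxy module.

```javascript
var http = require('http');

// Route all traffic to the "old" service by default; flip this to the
// port of the newly spawned service once it checks out.
var target = { host: 'localhost', port: 8888 };

http.createServer(function (req, res) {
  // Forward the incoming request to the target service...
  var upstream = http.request({
    host: target.host,
    port: target.port,
    method: req.method,
    path: req.url,
    headers: req.headers
  }, function (upstreamRes) {
    // ...and stream the response straight back to the client.
    res.writeHead(upstreamRes.statusCode, upstreamRes.headers);
    upstreamRes.pipe(res);
  });
  req.pipe(upstream);
}).listen(8080);
```
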

- -

Even better, if you use a service registry like seaport for managing the host/port tables, you can spin up new ad-hoc staging clusters all the time without disrupting the normal operation of your site before rolling out new code to users.

- -

Fleet has many more commands that you can learn about with its git-style manpage-based help system! Just do fleet help to get a list of all the commands you can run.

- -
fleet help
-Usage: fleet <command> [<args>]
-
-The commands are:
-  deploy   Push code to drones.
-  drone    Connect to a hub as a worker.
-  exec     Run commands on drones.
-  hub      Create a hub for drones to connect.
-  monitor  Show service events system-wide.
-  ps       List the running processes on the drones.
-  remote   Manage the set of remote hubs.
-  spawn    Run services on drones.
-  stop     Stop processes running on drones.
-
-For help about a command, try `fleet help <command>`.
- -

npm install -g fleet and check out the code on github!

- - diff --git a/locale/fa/blog/module/service-logging-in-json-with-bunyan.md b/locale/fa/blog/module/service-logging-in-json-with-bunyan.md deleted file mode 100644 index 4e2692e78f74..000000000000 --- a/locale/fa/blog/module/service-logging-in-json-with-bunyan.md +++ /dev/null @@ -1,340 +0,0 @@ ---- -title: Service logging in JSON with Bunyan -author: trentmick -date: 2012-03-28T19:25:26.000Z -status: publish -category: module -slug: service-logging-in-json-with-bunyan -layout: blog-post.hbs ---- - -
-Paul Bunyan and Babe the Blue Ox
-Photo by Paul Carroll -
- -

Service logs are gold, if you can mine them. We scan them for occasional debugging. Perhaps we grep them looking for errors or warnings, or set up an occasional Nagios log regex monitor. If that. This is a waste of the best channel for data about a service.

- -

"Log. (Huh) What is it good for. Absolutely ..."

- - - -

These are what logs are good for. The current state of logging is barely adequate for the first of these. Doing reliable analysis, and even monitoring, of varied "printf-style" logs is a grueling or hacky task that most either don't bother with, fall back to paying someone else to do (viz. Splunk's great successes), or, for web sites, punt and use the plethora of JavaScript-based web analytics tools.

- -

Let's log in JSON. Let's format log records with a filter outside the app. Let's put more info in log records by not shoehorning into a printf-message. Debuggability can be improved. Monitoring and analysis can definitely be improved. Let's not write another regex-based parser, and use the time we've saved writing tools to collate logs from multiple nodes and services, to query structured logs (from all services, not just web servers), etc.

- -

At Joyent we use node.js for running many core services -- loosely coupled through HTTP REST APIs and/or AMQP. In this post I'll draw on experiences from my work on Joyent's SmartDataCenter product and observations of Joyent Cloud operations to suggest some improvements to service logging. I'll show the (open source) Bunyan logging library and tool that we're developing to improve the logging toolchain.

- -

## Current State of Log Formatting

- -
# apache access log
-10.0.1.22 - - [15/Oct/2010:11:46:46 -0700] "GET /favicon.ico HTTP/1.1" 404 209
-fe80::6233:4bff:fe29:3173 - - [15/Oct/2010:11:46:58 -0700] "GET / HTTP/1.1" 200 44
-
-# apache error log
-[Fri Oct 15 11:46:46 2010] [error] [client 10.0.1.22] File does not exist: /Library/WebServer/Documents/favicon.ico
-[Fri Oct 15 11:46:58 2010] [error] [client fe80::6233:4bff:fe29:3173] File does not exist: /Library/WebServer/Documents/favicon.ico
-
-# Mac /var/log/secure.log
-Oct 14 09:20:56 banana loginwindow[41]: in pam_sm_authenticate(): Failed to determine Kerberos principal name.
-Oct 14 12:32:20 banana com.apple.SecurityServer[25]: UID 501 authenticated as user trentm (UID 501) for right 'system.privilege.admin'
-
-# an internal joyent agent log
-[2012-02-07 00:37:11.898] [INFO] AMQPAgent - Publishing success.
-[2012-02-07 00:37:11.910] [DEBUG] AMQPAgent - { req_id: '8afb8d99-df8e-4724-8535-3d52adaebf25',
-  timestamp: '2012-02-07T00:37:11.898Z',
-
-# typical expressjs log output
-[Mon, 21 Nov 2011 20:52:11 GMT] 200 GET /foo (1ms)
-Blah, some other unstructured output from a console.log call.
-
- -

What're we doing here? Five logs at random. Five different date formats. As Paul Querna points out, we haven't improved log parsability in 20 years. Parsability is enemy number one. You can't use your logs until you can parse the records, and faced with the above, the inevitable solution is a one-off regular expression.

- -

The current state of the art is various parsing libs, analysis tools and homebrew scripts ranging from grep to Perl, whose scope is limited to a few niche log formats.

- -

## JSON for Logs

- -

JSON.parse() solves all that. Let's log in JSON. But it means a change in thinking: The first-level audience for log files shouldn't be a person, but a machine.

- -

That is not said lightly. The "Unix Way" of small focused tools lightly coupled with text output is important. JSON is less "text-y" than, e.g., Apache common log format. JSON makes grep and awk awkward. Using less directly on a log is handy.

- -

But not handy enough. That 80's pastel jumpsuit awkwardness you're feeling isn't the JSON, it's your tools. Time to find a json tool -- json is one, bunyan described below is another one. Time to learn your JSON library instead of your regex library: JavaScript, Python, Ruby, Java, Perl.

- -

Time to burn your log4j Layout classes and move formatting to the tools side. Creating a log message with semantic information and throwing that away to make a string is silly. The win from being able to trivially parse log records is huge. The possibilities opened up by being able to add ad hoc structured information to individual log records are interesting: think program state metrics, think feeding to Splunk or Loggly, think easy audit logs.
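
To make the "machine-first" point concrete, here is a rough sketch of mining a line-delimited JSON log with nothing more than JSON.parse. The file name and the reporting format are made up for illustration; the numeric level (50 for "error") follows bunyan's level scheme.

```javascript
var fs = require('fs');

// Pull the error-level records out of a line-delimited JSON log
// without writing a single regular expression.
var records = fs.readFileSync('service.log', 'utf8')
  .split('\n')
  .filter(Boolean)                                      // skip blank lines
  .map(function (line) { return JSON.parse(line); });

records
  .filter(function (rec) { return rec.level >= 50; })   // 50 === bunyan "error"
  .forEach(function (rec) {
    console.log('%s %s: %s', rec.time, rec.name, rec.msg);
  });
```
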

- -

## Introducing Bunyan

- -

Bunyan is a node.js module for logging in JSON and a bunyan CLI tool to view those logs.

- -

Logging with Bunyan basically looks like this:

- -
$ cat hi.js
-var Logger = require('bunyan');
-var log = new Logger({name: 'hello' /*, ... */});
-log.info("hi %s", "paul");
-
- -

And you'll get a log record like this:

- -
$ node hi.js
-{"name":"hello","hostname":"banana.local","pid":40026,"level":30,"msg":"hi paul","time":"2012-03-28T17:25:37.050Z","v":0}
-
- -

Pipe that through the bunyan tool that is part of the "node-bunyan" install to get more readable output:

- -
$ node hi.js | ./node_modules/.bin/bunyan       # formatted text output
-[2012-02-07T18:50:18.003Z]  INFO: hello/40026 on banana.local: hi paul
-
-$ node hi.js | ./node_modules/.bin/bunyan -j    # indented JSON output
-{
-  "name": "hello",
-  "hostname": "banana.local",
-  "pid": 40087,
-  "level": 30,
-  "msg": "hi paul",
-  "time": "2012-03-28T17:26:38.431Z",
-  "v": 0
-}
-
- -

Bunyan is log4j-like: create a Logger with a name, call log.info(...), etc. However it has no intention of reproducing much of the functionality of log4j. IMO, much of that is overkill for the types of services you'll tend to be writing with node.js.

- -

Longer Bunyan Example

- -

Let's walk through a bigger example to show some interesting things in Bunyan. We'll create a very small "Hello API" server using the excellent restify library -- which we use heavily here at Joyent. (Bunyan doesn't require restify at all; you can easily use Bunyan with Express or whatever.)

- -

You can follow along in https://github.com/trentm/hello-json-logging if you like. Note that I'm using the current HEAD of the bunyan and restify trees here, so details might change a bit. Prerequisite: a node 0.6.x installation.

- -
git clone https://github.com/trentm/hello-json-logging.git
-cd hello-json-logging
-make
-
- -

Bunyan Logger

- -

Our server first creates a Bunyan logger:

- -
var Logger = require('bunyan');
-var log = new Logger({
-  name: 'helloapi',
-  streams: [
-    {
-      stream: process.stdout,
-      level: 'debug'
-    },
-    {
-      path: 'hello.log',
-      level: 'trace'
-    }
-  ],
-  serializers: {
-    req: Logger.stdSerializers.req,
-    res: restify.bunyan.serializers.response,
-  },
-});
-
- -

Every Bunyan logger must have a name. Unlike log4j, this is not a hierarchical dotted namespace. It is just a name field for the log records.

- -

Every Bunyan logger has one or more streams, to which log records are written. Here we've defined two: logging at DEBUG level and above is written to stdout, and logging at TRACE and above is appended to 'hello.log'.

- -

Bunyan has the concept of serializers: a registry of functions that know how to convert a JavaScript object for a certain log record field to a nice JSON representation for logging. For example, here we register the Logger.stdSerializers.req function to convert HTTP Request objects (using the field name "req") to JSON. More on serializers later.
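As a taste of what a serializer is, here is a minimal hand-rolled one. This is only a sketch: the "err" field name and the properties it keeps are illustrative, not something the example app above registers.

```
// Hypothetical custom serializer: any log call that passes an "err" field
// gets just these properties of the Error object in the JSON record.
function errSerializer(err) {
  return {
    name: err.name,
    message: err.message,
    stack: err.stack
  };
}

var log2 = new Logger({
  name: 'helloapi',
  streams: [{stream: process.stdout, level: 'info'}],
  serializers: {err: errSerializer}
});

log2.error({err: new Error('boom')}, 'something failed');
```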

- -

Restify Server

- -

Restify 1.x and above has bunyan support baked in. You pass in your Bunyan logger like this:

- -
var server = restify.createServer({
-  name: 'Hello API',
-  log: log   // Pass our logger to restify.
-});
-
- -

Our simple API will have a single GET /hello?name=NAME endpoint:

- -
server.get({path: '/hello', name: 'SayHello'}, function(req, res, next) {
-  var caller = req.params.name || 'caller';
-  req.log.debug('caller is "%s"', caller);
-  res.send({"hello": caller});
-  return next();
-});
-
- -

If we run that, node server.js, and call the endpoint, we get the expected restify response:

- -
$ curl -iSs http://0.0.0.0:8080/hello?name=paul
-HTTP/1.1 200 OK
-Access-Control-Allow-Origin: *
-Access-Control-Allow-Headers: Accept, Accept-Version, Content-Length, Content-MD5, Content-Type, Date, X-Api-Version
-Access-Control-Expose-Headers: X-Api-Version, X-Request-Id, X-Response-Time
-Server: Hello API
-X-Request-Id: f6aaf942-c60d-4c72-8ddd-bada459db5e3
-Access-Control-Allow-Methods: GET
-Connection: close
-Content-Length: 16
-Content-MD5: Xmn3QcFXaIaKw9RPUARGBA==
-Content-Type: application/json
-Date: Tue, 07 Feb 2012 19:12:35 GMT
-X-Response-Time: 4
-
-{"hello":"paul"}
-
- -

Setup Server Logging

- -

Let's add two things to our server. First, we'll use server.pre to hook into restify's request handling before routing, where we'll log the request.

- -
server.pre(function (request, response, next) {
-  request.log.info({req: request}, 'start');        // (1)
-  return next();
-});
-
- -

This is the first time we've seen this log.info style with an object as the first argument. Bunyan logging methods (log.trace, log.debug, ...) all support an optional first object argument with extra log record fields:

- -
log.info(<object> fields, <string> msg, ...)
-
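For example, you might write something like this (the extra fields here are invented purely for illustration):

```
// The object's properties become top-level fields in the JSON log record,
// merged in alongside the standard name, hostname, pid, level, msg, time and v.
log.info({route: 'SayHello', latency_ms: 4}, 'handled request from %s', 'paul');
```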
- -

Here we pass in the restify Request object, req. The "req" serializer we registered above will come into play here, but bear with me.

- -

Remember that we already had this debug log statement in our endpoint handler:

- -
req.log.debug('caller is "%s"', caller);            // (2)
-
- -

Second, we'll use the restify server's after event to log the response:

- -
server.on('after', function (req, res, route) {
-  req.log.info({res: res}, "finished");             // (3)
-});
-
- -

Log Output

- -

Now let's see what log output we get when somebody hits our API's endpoint:

- -
$ curl -iSs http://0.0.0.0:8080/hello?name=paul
-HTTP/1.1 200 OK
-...
-X-Request-Id: 9496dfdd-4ec7-4b59-aae7-3fed57aed5ba
-...
-
-{"hello":"paul"}
-
- -

Here is the server log:

- -
[trentm@banana:~/tm/hello-json-logging]$ node server.js
-... intro "listening at" log message elided ...
-{"name":"helloapi","hostname":"banana.local","pid":40341,"level":30,"req":{"method":"GET","url":"/hello?name=paul","headers":{"user-agent":"curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8r zlib/1.2.3","host":"0.0.0.0:8080","accept":"*/*"},"remoteAddress":"127.0.0.1","remotePort":59831},"msg":"start","time":"2012-03-28T17:37:29.506Z","v":0}
-{"name":"helloapi","hostname":"banana.local","pid":40341,"route":"SayHello","req_id":"9496dfdd-4ec7-4b59-aae7-3fed57aed5ba","level":20,"msg":"caller is \"paul\"","time":"2012-03-28T17:37:29.507Z","v":0}
-{"name":"helloapi","hostname":"banana.local","pid":40341,"route":"SayHello","req_id":"9496dfdd-4ec7-4b59-aae7-3fed57aed5ba","level":30,"res":{"statusCode":200,"headers":{"access-control-allow-origin":"*","access-control-allow-headers":"Accept, Accept-Version, Content-Length, Content-MD5, Content-Type, Date, X-Api-Version","access-control-expose-headers":"X-Api-Version, X-Request-Id, X-Response-Time","server":"Hello API","x-request-id":"9496dfdd-4ec7-4b59-aae7-3fed57aed5ba","access-control-allow-methods":"GET","connection":"close","content-length":16,"content-md5":"Xmn3QcFXaIaKw9RPUARGBA==","content-type":"application/json","date":"Wed, 28 Mar 2012 17:37:29 GMT","x-response-time":3}},"msg":"finished","time":"2012-03-28T17:37:29.510Z","v":0}
-
- -

Let's look at each in turn to see what is interesting -- pretty-printed with node server.js | ./node_modules/.bin/bunyan -j:

- -
{                                                   // (1)
-  "name": "helloapi",
-  "hostname": "banana.local",
-  "pid": 40442,
-  "level": 30,
-  "req": {
-    "method": "GET",
-    "url": "/hello?name=paul",
-    "headers": {
-      "user-agent": "curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8r zlib/1.2.3",
-      "host": "0.0.0.0:8080",
-      "accept": "*/*"
-    },
-    "remoteAddress": "127.0.0.1",
-    "remotePort": 59834
-  },
-  "msg": "start",
-  "time": "2012-03-28T17:39:44.880Z",
-  "v": 0
-}
-
- -

Here we logged the incoming request with request.log.info({req: request}, 'start'). The use of the "req" field triggers the "req" serializer registered at Logger creation.

- -

Next the req.log.debug in our handler:

- -
{                                                   // (2)
-  "name": "helloapi",
-  "hostname": "banana.local",
-  "pid": 40442,
-  "route": "SayHello",
-  "req_id": "9496dfdd-4ec7-4b59-aae7-3fed57aed5ba",
-  "level": 20,
-  "msg": "caller is \"paul\"",
-  "time": "2012-03-28T17:39:44.883Z",
-  "v": 0
-}
-
- -

and the log of the response in the "after" event:

- -
{                                                   // (3)
-  "name": "helloapi",
-  "hostname": "banana.local",
-  "pid": 40442,
-  "route": "SayHello",
-  "req_id": "9496dfdd-4ec7-4b59-aae7-3fed57aed5ba",
-  "level": 30,
-  "res": {
-    "statusCode": 200,
-    "headers": {
-      "access-control-allow-origin": "*",
-      "access-control-allow-headers": "Accept, Accept-Version, Content-Length, Content-MD5, Content-Type, Date, X-Api-Version",
-      "access-control-expose-headers": "X-Api-Version, X-Request-Id, X-Response-Time",
-      "server": "Hello API",
-      "x-request-id": "9496dfdd-4ec7-4b59-aae7-3fed57aed5ba",
-      "access-control-allow-methods": "GET",
-      "connection": "close",
-      "content-length": 16,
-      "content-md5": "Xmn3QcFXaIaKw9RPUARGBA==",
-      "content-type": "application/json",
-      "date": "Wed, 28 Mar 2012 17:39:44 GMT",
-      "x-response-time": 5
-    }
-  },
-  "msg": "finished",
-  "time": "2012-03-28T17:39:44.886Z",
-  "v": 0
-}
-
- -

Two useful details of note here:

- -
  1. The last two log messages include a "req_id" field (added to the req.log logger by restify). Note that this is the same UUID as the "X-Request-Id" header in the curl response. This means that if you use req.log for logging in your API handlers you will get an easy way to collate all logging for particular requests (see the example after this list).

     If yours is an SOA system with many services, a best practice is to carry that X-Request-Id/req_id through your system to enable collating handling of a single top-level request.

  2. The last two log messages include a "route" field. This tells you to which handler restify routed the request. While possibly useful for debugging, this can be very helpful for log-based monitoring of endpoints on a server.
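As promised, the collation example: pulling every record for a single request out of hello.log needs nothing fancier than grep plus the bunyan CLI for formatting (the UUID is the req_id from this run):

```
$ grep 9496dfdd-4ec7-4b59-aae7-3fed57aed5ba hello.log | ./node_modules/.bin/bunyan
```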
- -

Recall that we also set up all logging to go to the "hello.log" file. This was set at the TRACE level. Restify will log more detail of its operation at the trace level. See my "hello.log" for an example. The bunyan tool does a decent job of nicely formatting multiline messages and "req"/"res" keys (with color, not shown in the gist).

- -

This is logging you can use effectively.

- -

Other Tools

- -

Bunyan is just one of many options for logging in node.js-land. Others (that I know of) supporting JSON logging are winston and logmagic. Paul Querna has an excellent post on using JSON for logging, which shows logmagic usage and also touches on topics like the GELF logging format, log transporting, indexing and searching.

- -

Final Thoughts

- -

Parsing challenges won't ever completely go away, but they can for your logs if you use JSON. Collating log records across logs from multiple nodes is facilitated by a common "time" field. Correlating logging across multiple services is enabled by carrying a common "req_id" (or equivalent) through all such logs.

- -

Keeping separate log files for a single service is an anti-pattern. The typical Apache example of separate access and error logs is legacy, not an example to follow. A JSON log provides the structure necessary for tooling to easily filter for log records of a particular type.

- -

JSON logs bring possibilities. Feeding to tools like Splunk becomes easy. Ad hoc fields allow for a lightly spec'd comm channel from apps to other services: records with a "metric" could feed to statsd, records with a "loggly: true" could feed to loggly.com.
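Nothing special is needed to attach those ad hoc fields today; the field names below are hypothetical, not any kind of spec:

```
// A record a downstream consumer could route to statsd ...
log.info({metric: 'hello.requests', value: 1}, 'bump request counter');

// ... and one a log shipper could forward to a hosted service.
log.info({loggly: true, audit: true}, 'user settings changed');
```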

- -

Here I've described a very simple example of restify and bunyan usage for node.js-based API services with easy JSON logging. Restify provides a powerful framework for robust API services. Bunyan provides a light API for nice JSON logging and the beginnings of tooling to help consume Bunyan JSON logs.

- -

Update (29-Mar-2012): Fix styles somewhat for RSS readers.

diff --git a/locale/fa/blog/npm/2013-outage-postmortem.md b/locale/fa/blog/npm/2013-outage-postmortem.md deleted file mode 100644 index 01c2cb5238d4..000000000000 --- a/locale/fa/blog/npm/2013-outage-postmortem.md +++ /dev/null @@ -1,86 +0,0 @@ ---- -date: 2013-11-26T15:14:59.000Z -author: Charlie Robbins -title: Keeping The npm Registry Awesome -slug: npm-post-mortem -category: npm -layout: blog-post.hbs ---- - -We know the availability and overall health of The npm Registry is paramount to everyone using Node.js as well as the larger JavaScript community and those of your using it for [some][browserify] [awesome][dotc] [projects][npm-rubygems] [and ideas][npm-python]. Between November 4th and November 15th 2013 The npm Registry had several hours of downtime over three distinct time periods: - -1. November 4th -- 16:30 to 15:00 UTC -2. November 13th -- 15:00 to 19:30 UTC -3. November 15th -- 15:30 to 18:00 UTC - -The root cause of these downtime was insufficient resources: both hardware and human. This is a full post-mortem where we will be look at how npmjs.org works, what went wrong, how we changed the previous architecture of The npm Registry to fix it, as well next steps we are taking to prevent this from happening again. - -All of the next steps require additional expenditure from Nodejitsu: both servers and labor. This is why along with this post-mortem we are announcing our [crowdfunding campaign: scalenpm.org](https://scalenpm.org)! Our goal is to raise enough funds so that Nodejitsu can continue to run The npm Registry as a free service for _you, the community._ - -Please take a minute now to donate at [https://scalenpm.org](https://scalenpm.org)! - -## How does npmjs.org work? - -There are two distinct components that make up npmjs.org operated by different people: - -* **http://registry.npmjs.org**: The main CouchApp (Github: [isaacs/npmjs.org](https://github.com/isaacs/npmjs.org)) that stores both package tarballs and metadata. It is operated by Nodejitsu since we [acquired IrisCouch in May](https://www.nodejitsu.com/company/press/2013/05/22/iriscouch/). The primary system administrator is [Jason Smith](https://github.com/jhs), the current CTO at Nodejitsu, cofounder of IrisCouch, and the System Administrator of registry.npmjs.org since 2011. -* **https://npmjs.com**: The npmjs website that you interact with using a web browser. It is a Node.js program (Github: [isaacs/npm-www](https://github.com/isaacs/npm-www)) maintained and operated by Isaac and running on a Joyent Public Cloud SmartMachine. - -Here is a high-level summary of the _old architecture:_ - -
- old npm architecture -
Diagram 1. Old npm architecture
-
- -## What went wrong and how was it fixed? - -As illustrated above, before November 13th, 2013, npm operated as a single CouchDB server with regular daily backups. We briefly ran a multi-master CouchDB setup after downtime back in August, but after reports that `npm login` no longer worked correctly we rolled back to a single CouchDB server. On both November 13th and November 15th CouchDB became unresponsive on requests to the `/registry` database while requests to all other databases (e.g. `/public_users`) remained responsive. Although the root cause of the CouchDB failures have yet to be determined given that only requests to `/registry` were slow and/or timed out we suspect it is related to the massive number of attachments stored in the registry. - -The incident on November 4th was ultimately resolved by a reboot and resize of the host machine, but when the same symptoms reoccured less than 10 days later additional steps were taken: - -1. The [registry was moved to another machine][ops-new-machine] of equal resources to exclude the possibility of a hardware issue. -2. The [registry database itself][ops-compaction] was [compacted][compaction]. - -When neither of these yielded a solution Jason Smith and I decided to move to a multi-master architecture with continuous replication illustrated below: - -
- current npm architecture -
Diagram 2. Current npm architecture -- Red-lines denote continuous replication
-
- -This _should_ have been the end of our story but unfortunately our supervision logic did not function properly to restart the secondary master on the morning of November 15th. During this time we [moved briefly][ops-single-server] back to a single master architecture. Since then the secondary master has been closely monitored by the entire Nodejitsu operations team to ensure it's continued stability. - -## What is being done to prevent future incidents? - -The public npm registry simply cannot go down. **Ever.** We gained a lot of operational knowledge about The npm Registry and about CouchDB as a result of these outages. This new knowledge has made clear several steps that we need to take to prevent future downtime: - -1. **Always be in multi-master**: The multi-master CouchDB architecture we have setup will scale to more than just two CouchDB servers. _As npm grows we'll be able to add additional capacity!_ -2. **Decouple www.npmjs.org and registry.npmjs.org**: Right now www.npmjs.org still depends directly on registry.npmjs.org. We are planning to add an additional replica to the current npm architecture so that Isaac can more easily service requests to www.npmjs.org. That means it won't go down if the registry goes down. -3. **Always have a spare replica**: We need have a hot spare replica running continuous replication from either to swap out when necessary. This is also important as we need to regularly run compaction on each master since the registry is growing ~10GB per week on disk. -4. **Move attachments out of CouchDB**: Work has begun to move the package tarballs out of CouchDB and into [Joyent's Manta service](http://www.joyent.com/products/manta). Additionally, [MaxCDN](http://www.maxcdn.com/) has generously offered to provide CDN services for npm, once the tarballs are moved out of the registry database. This will help improve delivery speed, while dramatically reducing the file system I/O load on the CouchDB servers. Work is progressing slowly, because at each stage in the plan, we are making sure that current replication users are minimally impacted. - -When these new infrastructure components are in-place The npm Registry will look like this: - -
- planned npm architecture -
- Diagram 3. Planned npm architecture -- Red-lines denote continuous replication
-
- -## You are npm! And we need your help! - -The npm Registry has had a 10x year. In November 2012 there were 13.5 million downloads. In October 2013 there were **114.6 million package downloads.** We're honored to have been a part of sustaining this growth for the community and we want to see it continue to grow to a billion package downloads a month and beyond. - -_**But we need your help!**_ All of these necessary improvements require more servers, more time from Nodejitsu staff and an overall increase to what we spend maintaining the public npm registry as a free service for the Node.js community. - -Please take a minute now to donate at [https://scalenpm.org](https://scalenpm.org)! - -[browserify]: http://browserify.org/ -[dotc]: https://github.com/substack/dotc -[npm-rubygems]: http://andrew.ghost.io/emulating-node-js-modules-in-ruby/ -[npm-python]: https://twitter.com/__lucas/status/391688082573258753 -[ops-new-machine]: https://twitter.com/npmjs/status/400692071377276928 -[ops-compaction]: https://twitter.com/npmjs/status/400705715846643712 -[compaction]: http://wiki.apache.org/couchdb/Compaction -[ops-single-server]: https://twitter.com/npmjs/status/401384681507016704 diff --git a/locale/fa/blog/npm/index.md b/locale/fa/blog/npm/index.md deleted file mode 100644 index a26081ad303e..000000000000 --- a/locale/fa/blog/npm/index.md +++ /dev/null @@ -1,6 +0,0 @@ ---- -title: NPM -layout: category-index.hbs -listing: true -robots: noindex, follow ---- diff --git a/locale/fa/blog/npm/managing-node-js-dependencies-with-shrinkwrap.md b/locale/fa/blog/npm/managing-node-js-dependencies-with-shrinkwrap.md deleted file mode 100644 index 7244576e14c7..000000000000 --- a/locale/fa/blog/npm/managing-node-js-dependencies-with-shrinkwrap.md +++ /dev/null @@ -1,174 +0,0 @@ ---- -title: Managing Node.js Dependencies with Shrinkwrap -author: Dave Pacheco -date: 2012-02-27T18:51:59.000Z -status: publish -category: npm -slug: managing-node-js-dependencies-with-shrinkwrap -layout: blog-post.hbs ---- - -


-Photo by Luc Viatour (flickr)

- -

Managing dependencies is a fundamental problem in building complex software. The terrific success of github and npm has made code reuse especially easy in the Node world, where packages don't exist in isolation but rather as nodes in a large graph. The software is constantly changing (releasing new versions), and each package has its own constraints about what other packages it requires to run (dependencies). npm keeps track of these constraints, and authors express what kind of changes are compatible using semantic versioning, allowing authors to specify that their package will work with even future versions of its dependencies as long as the semantic versions are assigned properly.

-

This does mean that when you "npm install" a package with dependencies, there's no guarantee that you'll get the same set of code now that you would have gotten an hour ago, or that you would get if you were to run it again an hour later. You may get a bunch of bug fixes now that weren't available an hour ago. This is great during development, where you want to keep up with changes upstream. It's not necessarily what you want for deployment, though, where you want to validate whatever bits you're actually shipping. - -

-

Put differently, it's understood that all software changes incur some risk, and it's critical to be able to manage this risk on your own terms. Taking that risk in development is good because by definition that's when you're incorporating and testing software changes. On the other hand, if you're shipping production software, you probably don't want to take this risk when cutting a release candidate (i.e. build time) or when you actually ship (i.e. deploy time) because you want to validate whatever you ship. - -

-

You can address a simple case of this problem by only depending on specific versions of packages, allowing no semver flexibility at all, but this falls apart when you depend on packages that don't also adopt the same principle. Many of us at Joyent started wondering: can we generalize this approach? - -
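For reference, pinning an exact version looks like this in package.json -- a sketch using the A and B packages introduced below, with B locked to exactly 0.0.1 rather than a range:

```
{
    "name": "A",
    "version": "0.1.0",
    "dependencies": {
        "B": "0.0.1"
    }
}
```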

-

Shrinkwrapping packages

-

That brings us to npm shrinkwrap[1]: - -

- -``` -NAME - npm-shrinkwrap -- Lock down dependency versions - -SYNOPSIS - npm shrinkwrap - -DESCRIPTION - This command locks down the versions of a package's dependencies so - that you can control exactly which versions of each dependency will - be used when your package is installed. -``` - -

Let's consider package A: - -

-
{
-    "name": "A",
-    "version": "0.1.0",
-    "dependencies": {
-        "B": "<0.1.0"
-    }
-}
-

package B: - -

-
{
-    "name": "B",
-    "version": "0.0.1",
-    "dependencies": {
-        "C": "<0.1.0"
-    }
-}
-

and package C: - -

-
{
-    "name": "C,
-    "version": "0.0.1"
-}
-

If these are the only versions of A, B, and C available in the registry, then a normal "npm install A" will install: - -

-
A@0.1.0
-└─┬ B@0.0.1
-  └── C@0.0.1
-

If B@0.0.2 is then published, a fresh "npm install A" will install:

-
A@0.1.0
-└─┬ B@0.0.2
-  └── C@0.0.1
-

assuming the new version did not modify B's dependencies. Of course, the new version of B could include a new version of C and any number of new dependencies. As we said before, if A's author doesn't want that, she could specify a dependency on B@0.0.1. But if A's author and B's author are not the same person, there's no way for A's author to say that she does not want to pull in newly published versions of C when B hasn't changed at all. - -

-

In this case, A's author can use - -

-
# npm shrinkwrap
-

This generates npm-shrinkwrap.json, which will look something like this: - -

-
{
-    "name": "A",
-    "dependencies": {
-        "B": {
-            "version": "0.0.1",
-            "dependencies": {
-                "C": {  "version": "0.1.0" }
-            }
-        }
-    }
-}
-

The shrinkwrap command has locked down the dependencies based on what's currently installed in node_modules. When "npm install" installs a package with an npm-shrinkwrap.json file in the package root, the shrinkwrap file (rather than package.json files) completely drives the installation of that package and all of its dependencies (recursively). So now the author publishes A@0.1.0, and subsequent installs of this package will use B@0.0.1 and C@0.0.1, regardless of the dependencies and versions listed in A's, B's, and C's package.json files. If the authors of B and C publish new versions, they won't be used to install A because the shrinkwrap refers to older versions. Even if you generate a new shrinkwrap, it will still reference the older versions, since "npm shrinkwrap" uses what's installed locally rather than what's available in the registry.

-

Using shrinkwrapped packages

-

Using a shrinkwrapped package is no different than using any other package: you can "npm install" it by hand, or add a dependency to your package.json file and "npm install" it. - -

-

Building shrinkwrapped packages

-

To shrinkwrap an existing package: - -

-
  1. Run "npm install" in the package root to install the current versions of all dependencies.
  2. Validate that the package works as expected with these versions.
  3. Run "npm shrinkwrap", add npm-shrinkwrap.json to git, and publish your package.
-
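In shell terms that flow is roughly the following; this is a sketch, not a prescription, and the test command is whatever validation your package normally uses:

```
npm install
npm test                      # or however you validate the package
npm shrinkwrap
git add npm-shrinkwrap.json
git commit -m "Lock down dependency versions"
npm publish
```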

To add or update a dependency in a shrinkwrapped package: - -

-
  1. Run "npm install" in the package root to install the current versions of all dependencies.
  2. Add or update dependencies. "npm install" each new or updated package individually and then update package.json.
  3. Validate that the package works as expected with the new dependencies.
  4. Run "npm shrinkwrap", commit the new npm-shrinkwrap.json, and publish your package.
-
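And the add-or-update flow, sketched with the example package B from earlier (again, only illustrative):

```
npm install                   # current, shrinkwrapped versions
npm install B@0.0.2           # pull in the new version you want
# edit package.json to match, then re-run your validation
npm shrinkwrap
git add package.json npm-shrinkwrap.json
git commit -m "Update B to 0.0.2"
npm publish
```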

You can still use npm outdated(1) to view which dependencies have newer versions available. - -

-

For more details, check out the full docs on npm shrinkwrap, from which much of the above is taken. - -

-

Why not just check node_modules into git?

-

One previously proposed solution is to "npm install" your dependencies during development and commit the results into source control. Then you deploy your app from a specific git SHA knowing you've got exactly the same bits that you tested in development. This does address the problem, but it has its own issues: for one, binaries are tricky because you need to "npm install" them to get their sources, but this builds the [system-dependent] binary too. You can avoid checking in the binaries and use "npm rebuild" at build time, but we've had a lot of difficulty trying to do this.[2] At best, this is second-class treatment for binary modules, which are critical for many important types of Node applications.[3] - -

-

Besides the issues with binary modules, this approach just felt wrong to many of us. There's a reason we don't check binaries into source control, and it's not just because they're platform-dependent. (After all, we could build and check in binaries for all supported platforms and operating systems.) It's because that approach is error-prone and redundant: error-prone because it introduces a new human failure mode where someone checks in a source change but doesn't regenerate all the binaries, and redundant because the binaries can always be built from the sources alone. An important principle of software version control is that you don't check in files derived directly from other files by a simple transformation.[4] Instead, you check in the original sources and automate the transformations via the build process. - -

-

Dependencies are just like binaries in this regard: they're files derived from a simple transformation of something else that is (or could easily be) already available: the name and version of the dependency. Checking them in has all the same problems as checking in binaries: people could update package.json without updating the checked-in module (or vice versa). Besides that, adding new dependencies has to be done by hand, introducing more opportunities for error (checking in the wrong files, not checking in certain files, inadvertently changing files, and so on). Our feeling was: why check in this whole dependency tree (and create a mess for binary add-ons) when we could just check in the package name and version and have the build process do the rest? - -

-

Finally, the approach of checking in node_modules doesn't really scale for us. We've got at least a dozen repos that will use restify, and it doesn't make sense to check that in everywhere when we could instead just specify which version each one is using. There's another principle at work here, which is separation of concerns: each repo specifies what it needs, while the build process figures out where to get it. - -

-

What if an author republishes an existing version of a package?

-

We're not suggesting deploying a shrinkwrapped package directly and running "npm install" to install from shrinkwrap in production. We already have a build process to deal with binary modules and other automateable tasks. That's where we do the "npm install". We tar up the result and distribute the tarball. Since we test each build before shipping, we won't deploy something we didn't test. - -
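A sketch of that build-time flow (the tarball name is made up):

```
npm install                         # resolves exactly what npm-shrinkwrap.json says
npm test                            # validate the bits we are about to ship
tar -czf ../my-service-build.tar.gz .   # distribute this tarball, not the shrinkwrap
```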

-

It's still possible to pick up newly published versions of existing packages at build time. We assume force publish is not that common in the first place, let alone force publish that breaks compatibility. If you're worried about this, you can use git SHAs in the shrinkwrap or even consider maintaining a mirror of the part of the npm registry that you use and require human confirmation before mirroring unpublishes. - -

-

Final thoughts

-

Of course, the details of each use case matter a lot, and the world doesn't have to pick just one solution. If you like checking in node_modules, you should keep doing that. We've chosen the shrinkwrap route because that works better for us. - -

-

It's not exactly news that Joyent is heavy on Node. Node is the heart of our SmartDataCenter (SDC) product, whose public-facing web portal, public API, Cloud Analytics, provisioning, billing, heartbeating, and other services are all implemented in Node. That's why it's so important to us to have robust components (like logging and REST) and tools for understanding production failures postmortem, profile Node apps in production, and now managing Node dependencies. Again, we're interested to hear feedback from others using these tools. - -

-
-Dave Pacheco blogs at dtrace.org. - -

[1] Much of this section is taken directly from the "npm shrinkwrap" documentation. - -

-

[2] We've had a lot of trouble with checking in node_modules with binary dependencies. The first problem is figuring out exactly which files not to check in (.o, .node, .dynlib, .so, *.a, ...). When Mark went to apply this to one of our internal services, the "npm rebuild" step blew away half of the dependency tree because it ran "make clean", which in dependency ldapjs brings the repo to a clean slate by blowing away its dependencies. Later, a new (but highly experienced) engineer on our team was tasked with fixing a bug in our Node-based DHCP server. To fix the bug, we went with a new dependency. He tried checking in node_modules, which added 190,000 lines of code (to this repo that was previously a few hundred LOC). And despite doing everything he could think of to do this correctly and test it properly, the change broke the build because of the binary modules. So having tried this approach a few times now, it appears quite difficult to get right, and as I pointed out above, the lack of actual documentation and real world examples suggests others either aren't using binary modules (which we know isn't true) or haven't had much better luck with this approach. - -

-

[3] Like a good Node-based distributed system, our architecture uses lots of small HTTP servers. Each of these serves a REST API using restify. restify uses the binary module node-dtrace-provider, which gives each of our services deep DTrace-based observability for free. So literally almost all of our components are or will soon be depending on a binary add-on. Additionally, the foundation of Cloud Analytics is a pair of binary modules that extract data from DTrace and kstat. So this isn't a corner case for us, and we don't believe we're exceptional in this regard. The popular hiredis package for interfacing with redis from Node is also a binary module.

-

[4] Note that I said this is an important principle for software version control, not using git in general. People use git for lots of things where checking in binaries and other derived files is probably fine. Also, I'm not interested in proselytizing; if you want to do this for software version control too, go ahead. But don't do it out of ignorance of existing successful software engineering practices.

diff --git a/locale/fa/blog/npm/npm-1-0-global-vs-local-installation.md b/locale/fa/blog/npm/npm-1-0-global-vs-local-installation.md deleted file mode 100644 index 380eb5f48601..000000000000 --- a/locale/fa/blog/npm/npm-1-0-global-vs-local-installation.md +++ /dev/null @@ -1,67 +0,0 @@ ---- -title: "npm 1.0: Global vs Local installation" -author: Isaac Schlueter -date: 2011-03-24T06:07:13.000Z -status: publish -category: npm -slug: npm-1-0-global-vs-local-installation -layout: blog-post.hbs ---- - -

npm 1.0 is in release candidate mode. Go get it!

- -

More than anything else, the driving force behind the npm 1.0 rearchitecture was the desire to simplify what a package installation directory structure looks like.

- -

In npm 0.x, there was a command called bundle that a lot of people liked. bundle let you install your dependencies locally in your project, but even still, it was basically a hack that never really worked very reliably.

- -

Also, there was that activation/deactivation thing. That’s confusing.

- -

Two paths

- -

In npm 1.0, there are two ways to install things:

- -
  1. globally -- This drops modules in {prefix}/lib/node_modules, and puts executable files in {prefix}/bin, where {prefix} is usually something like /usr/local. It also installs man pages in {prefix}/share/man, if they’re supplied.
  2. locally -- This installs your package in the current working directory. Node modules go in ./node_modules, executables go in ./node_modules/.bin/, and man pages aren’t installed at all.
- -

Which to choose

- -

Whether to install a package globally or locally depends on the global config, which is aliased to the -g command line switch.

- -

Just like how global variables are kind of gross, but also necessary in some cases, global packages are important, but best avoided if not needed.

- -

In general, the rule of thumb is:

- -
  1. If you’re installing something that you want to use in your program, using require('whatever'), then install it locally, at the root of your project.
  2. If you’re installing something that you want to use in your shell, on the command line or something, install it globally, so that its binaries end up in your PATH environment variable.
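In concrete terms (the package names are just examples):

```
cd ~/projects/my-app
npm install express            # a library you will require(): install locally
npm install -g coffee-script   # a command you will run from your shell: install globally
```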
- -

When you can't choose

- -

Of course, there are some cases where you want to do both. Coffee-script and Express both are good examples of apps that have a command line interface, as well as a library. In those cases, you can do one of the following:

- -
  1. Install it in both places. Seriously, are you that short on disk space? It’s fine, really. They’re tiny JavaScript programs.
  2. Install it globally, and then npm link coffee-script or npm link express (if you’re on a platform that supports symbolic links.) Then you only need to update the global copy to update all the symlinks as well.
- -

The first option is the best in my opinion. Simple, clear, explicit. The second is really handy if you are going to re-use the same library in a bunch of different projects. (More on npm link in a future installment.)

- -

You can probably think of other ways to do it by messing with environment variables. But I don’t recommend those ways. Go with the grain.

- -

Slight exception: It’s not always the cwd.

- -

Let’s say you do something like this:

- -
cd ~/projects/foo     # go into my project
-npm install express   # ./node_modules/express
-cd lib/utils          # move around in there
-vim some-thing.js     # edit some stuff, work work work
-npm install redis     # ./lib/utils/node_modules/redis!? ew.
- -

In this case, npm will install redis into ~/projects/foo/node_modules/redis. Sort of like how git will work anywhere within a git repository, npm will work anywhere within a package, defined by having a node_modules folder.

- -

Test runners and stuff

- -

If your package's scripts.test command uses a command-line program installed by one of your dependencies, not to worry. npm makes ./node_modules/.bin the first entry in the PATH environment variable when running any lifecycle scripts, so this will work fine, even if your program is not globally installed: - -

{ "name" : "my-program"
-, "version" : "1.2.3"
-, "dependencies": { "express": "*", "coffee-script": "*" }
-, "devDependencies": { "vows": "*" }
-, "scripts":
-  { "test": "vows test/*.js"
-  , "preinstall": "cake build" } }
diff --git a/locale/fa/blog/npm/npm-1-0-link.md b/locale/fa/blog/npm/npm-1-0-link.md deleted file mode 100644 index d8fd1304742f..000000000000 --- a/locale/fa/blog/npm/npm-1-0-link.md +++ /dev/null @@ -1,117 +0,0 @@ ---- -title: "npm 1.0: link" -author: Isaac Schlueter -date: 2011-04-07T00:40:33.000Z -status: publish -category: npm -slug: npm-1-0-link -layout: blog-post.hbs ---- - -

npm 1.0 is in release candidate mode. Go get it!

- -

In npm 0.x, there was a command called link. With it, you could “link-install” a package so that changes would be reflected in real-time. This is especially handy when you’re actually building something. You could make a few changes, run the command again, and voila, your new code would be run without having to re-install every time.

- -

Of course, compiled modules still have to be rebuilt. That’s not ideal, but it’s a problem that will take more powerful magic to solve.

- -

In npm 0.x, this was a pretty awful kludge. Back then, every package existed in some folder like:

- -
prefix/lib/node/.npm/my-package/1.3.6/package
-
- -

and the package’s version and name could be inferred from the path. Then, symbolic links were set up that looked like:

- -
prefix/lib/node/my-package@1.3.6 -> ./.npm/my-package/1.3.6/package
-
- -

It was easy enough to point that symlink to a different location. However, since the package.json file could change, that meant that the connection between the version and the folder was not reliable.

- -

At first, this was just sort of something that we dealt with by saying, “Relink if you change the version.” However, as more and more edge cases arose, eventually the solution was to give link packages this fakey version of “9999.0.0-LINK-hash” so that npm knew it was an impostor. Sometimes the package was treated as if it had the 9999.0.0 version, and other times it was treated as if it had the version specified in the package.json.

- -

A better way

- -

For npm 1.0, we backed up and looked at what the actual use cases were. Most of the time when you link something you want one of the following:

- -
  1. globally install this package I’m working on so that I can run the command it creates and test its stuff as I work on it.
  2. locally install my thing into some other thing that depends on it, so that the other thing can require() it.
- -

And, in both cases, changes should be immediately apparent and not require any re-linking.

- -

Also, there’s a third use case that I didn’t really appreciate until I started writing more programs that had more dependencies:

- -
  1. Globally install something, and use it in development in a bunch of projects, and then update them all at once so that they all use the latest version.

- -

Really, the second case above is a special-case of this third case.

- - - -

The first step is to link your local project into the global install space. (See global vs local installation for more on this global/local business.)

- -

I do this as I’m developing node projects (including npm itself).

- -
cd ~/dev/js/node-tap  # go into the project dir
-npm link              # create symlinks into {prefix}
-
- -

Because of how I have my computer set up, with /usr/local as my install prefix, I end up with a symlink from /usr/local/lib/node_modules/tap pointing to ~/dev/js/node-tap, and the executable linked to /usr/local/bin/tap.

- -

Of course, if you set your paths differently, then you’ll have different results. (That’s why I tend to talk in terms of prefix rather than /usr/local.)

- - - -

When you want to link the globally-installed package into your local development folder, you run npm link pkg where pkg is the name of the package that you want to install.

- -

For example, let’s say that I wanted to write some tap tests for my node-glob package. I’d first do the steps above to link tap into the global install space, and then I’d do this:

- -
cd ~/dev/js/node-glob  # go to the project that uses the thing.
-npm link tap           # link the global thing into my project.
-
- -

Now when I make changes in ~/dev/js/node-tap, they’ll be immediately reflected in ~/dev/js/node-glob/node_modules/tap.

- - - -

Let’s say I have 15 sites that all use express. I want the benefits of local development, but I also want to be able to update all my dev folders at once. You can globally install express, and then link it into your local development folder.

- -
npm install express -g  # install express globally
-cd ~/dev/js/my-blog     # development folder one
-npm link express        # link the global express into ./node_modules
-cd ~/dev/js/photo-site  # other project folder
-npm link express        # link express into here, as well
-
-                        # time passes
-                        # TJ releases some new stuff.
-                        # you want this new stuff.
-
-npm update express -g   # update the global install.
-                        # this also updates my project folders.
-
- -

Caveat: Not For Real Servers

- -

npm link is a development tool. It’s awesome for managing packages on your local development box. But deploying with npm link is basically asking for problems, since it makes it super easy to update things without realizing it.

- -

Caveat 2: Sorry, Windows!

- -

I highly doubt that a native Windows node will ever have comparable symbolic link support to what Unix systems provide. I know that there are junctions and such, and I've heard legends about symbolic links on Windows 7.

- -

When there is a native windows port of Node, if that native windows port has `fs.symlink` and `fs.readlink` support that is exactly identical to the way that they work on Unix, then this should work fine.

- -

But I wouldn't hold my breath. Any bugs about this not working on a native Windows system (ie, not Cygwin) will most likely be closed with wontfix.

- - -

Aside: Credit where Credit’s Due

- -

Back before the Great Package Management Wars of Node 0.1, before npm or kiwi or mode or seed.js could do much of anything, and certainly before any of them had more than 2 users, Mikeal Rogers invited me to the Couch.io offices for lunch to talk about this npm registry thingie I’d mentioned wanting to build. (That is, to convince me to use CouchDB for it.)

- -

Since he was volunteering to build the first version of it, and since couch is pretty much the ideal candidate for this use-case, it was an easy sell.

- -

While I was there, he said, “Look. You need to be able to link a project directory as if it was installed as a package, and then have it all Just Work. Can you do that?”

- -

I was like, “Well, I don’t know… I mean, there’s these edge cases, and it doesn’t really fit with the existing folder structure very well…”

- -

“Dude. Either you do it, or I’m going to have to do it, and then there’ll be another package manager in node, instead of writing a registry for npm, and it won’t be as good anyway. Don’t be python.”

- -

The rest is history.

diff --git a/locale/fa/blog/npm/npm-1-0-released.md b/locale/fa/blog/npm/npm-1-0-released.md deleted file mode 100644 index abc105708d44..000000000000 --- a/locale/fa/blog/npm/npm-1-0-released.md +++ /dev/null @@ -1,39 +0,0 @@ ---- -title: "npm 1.0: Released" -author: Isaac Schlueter -date: 2011-05-01T15:09:45.000Z -status: publish -category: npm -slug: npm-1-0-released -layout: blog-post.hbs ---- - -

npm 1.0 has been released. Here are the highlights:

- - - -

The focus is on npm being a development tool, rather than an apt-wannabe.

- -

Installing it

- -

To get the new version, run this command:

- -
curl https://npmjs.com/install.sh | sh 
- -

This will prompt to ask you if it’s ok to remove all the old 0.x cruft. If you want to not be asked, then do this:

- -
curl https://npmjs.com/install.sh | clean=yes sh 
- -

Or, if you want to not do the cleanup, and leave the old stuff behind, then do this:

- -
curl https://npmjs.com/install.sh | clean=no sh 
- -

A lot of people in the node community were brave testers and helped make this release a lot better (and swifter) than it would have otherwise been. Thanks :)

- -

Code Freeze

- -

npm will not have any major feature enhancements or architectural changes for at least 6 months. There are interesting developments planned that leverage npm in some ways, but it’s time to let the client itself settle. Also, I want to focus attention on some other problems for a little while.

- -

Of course, bug reports are always welcome.

- -

See you at NodeConf!

diff --git a/locale/fa/blog/npm/npm-1-0-the-new-ls.md b/locale/fa/blog/npm/npm-1-0-the-new-ls.md deleted file mode 100644 index b2b72067e91f..000000000000 --- a/locale/fa/blog/npm/npm-1-0-the-new-ls.md +++ /dev/null @@ -1,147 +0,0 @@ ---- -title: "npm 1.0: The New 'ls'" -author: Isaac Schlueter -date: 2011-03-18T06:22:17.000Z -status: publish -category: npm -slug: npm-1-0-the-new-ls -layout: blog-post.hbs ---- - -

This is the first in a series of hopefully more than 1 posts, each detailing some aspect of npm 1.0.

- -

In npm 0.x, the ls command was a combination of both searching the registry as well as reporting on what you have installed.

- -

As the registry has grown in size, this has gotten unwieldy. Also, since npm 1.0 manages dependencies differently, nesting them in the node_modules folder and installing locally by default, there are different things that you want to view.

- -

The functionality of the ls command was split into two different parts. search is now the way to find things on the registry (and it only reports one line per package, instead of one line per version), and ls shows a tree view of the packages that are installed locally.
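So, roughly, the split looks like this (a sketch of the two commands just described, not new functionality):

```
$ npm search markdown   # query the registry: one line per matching package
$ npm ls                # inspect the local install: a tree of node_modules
```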

- -

Here’s an example of the output:

- -
$ npm ls
-npm@1.0.0 /Users/isaacs/dev-src/js/npm
-├── semver@1.0.1 
-├─┬ ronn@0.3.5 
-│ └── opts@1.2.1 
-└─┬ express@2.0.0rc3 extraneous 
-  ├─┬ connect@1.1.0 
-  │ ├── qs@0.0.7 
-  │ └── mime@1.2.1 
-  ├── mime@1.2.1 
-  └── qs@0.0.7
-
- -

This is after I’ve done npm install semver ronn express in the npm source directory. Since express isn’t actually a dependency of npm, it shows up with that “extraneous” marker.

- -

Let’s see what happens when we create a broken situation:

- -
$ rm -rf ./node_modules/express/node_modules/connect
-$ npm ls
-npm@1.0.0 /Users/isaacs/dev-src/js/npm
-├── semver@1.0.1 
-├─┬ ronn@0.3.5 
-│ └── opts@1.2.1 
-└─┬ express@2.0.0rc3 extraneous 
-  ├── UNMET DEPENDENCY connect >= 1.1.0 < 2.0.0
-  ├── mime@1.2.1 
-  └── qs@0.0.7
-
- -

Tree views are great for human readability, but sometimes you want to pipe that stuff to another program. For that output, I took the same data structure, but instead of building up a treeview string for each line, it spits out just the folders, like this:

- -
$ npm ls -p
-/Users/isaacs/dev-src/js/npm
-/Users/isaacs/dev-src/js/npm/node_modules/semver
-/Users/isaacs/dev-src/js/npm/node_modules/ronn
-/Users/isaacs/dev-src/js/npm/node_modules/ronn/node_modules/opts
-/Users/isaacs/dev-src/js/npm/node_modules/express
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/connect
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/connect/node_modules/qs
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/connect/node_modules/mime
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/mime
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/qs
-
- -

Since you sometimes want a bigger view, I added the --long option (shorthand: -l) to spit out more info:

- -
$ npm ls -l
-npm@1.0.0 
-│ /Users/isaacs/dev-src/js/npm
-│ A package manager for node
-│ git://github.com/isaacs/npm.git
-│ https://npmjs.com/
-├── semver@1.0.1 
-│   ./node_modules/semver
-│   The semantic version parser used by npm.
-│   git://github.com/isaacs/node-semver.git
-├─┬ ronn@0.3.5 
-│ │ ./node_modules/ronn
-│ │ markdown to roff and html converter
-│ └── opts@1.2.1 
-│     ./node_modules/ronn/node_modules/opts
-│     Command line argument parser written in the style of commonjs. To be used with node.js
-└─┬ express@2.0.0rc3 extraneous 
-  │ ./node_modules/express
-  │ Sinatra inspired web development framework
-  ├─┬ connect@1.1.0 
-  │ │ ./node_modules/express/node_modules/connect
-  │ │ High performance middleware framework
-  │ │ git://github.com/senchalabs/connect.git
-  │ ├── qs@0.0.7 
-  │ │   ./node_modules/express/node_modules/connect/node_modules/qs
-  │ │   querystring parser
-  │ └── mime@1.2.1 
-  │     ./node_modules/express/node_modules/connect/node_modules/mime
-  │     A comprehensive library for mime-type mapping
-  ├── mime@1.2.1 
-  │   ./node_modules/express/node_modules/mime
-  │   A comprehensive library for mime-type mapping
-  └── qs@0.0.7 
-      ./node_modules/express/node_modules/qs
-      querystring parser
-
-$ npm ls -lp
-/Users/isaacs/dev-src/js/npm:npm@1.0.0::::
-/Users/isaacs/dev-src/js/npm/node_modules/semver:semver@1.0.1::::
-/Users/isaacs/dev-src/js/npm/node_modules/ronn:ronn@0.3.5::::
-/Users/isaacs/dev-src/js/npm/node_modules/ronn/node_modules/opts:opts@1.2.1::::
-/Users/isaacs/dev-src/js/npm/node_modules/express:express@2.0.0rc3:EXTRANEOUS:::
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/connect:connect@1.1.0::::
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/connect/node_modules/qs:qs@0.0.7::::
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/connect/node_modules/mime:mime@1.2.1::::
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/mime:mime@1.2.1::::
-/Users/isaacs/dev-src/js/npm/node_modules/express/node_modules/qs:qs@0.0.7::::
-
- -

And, if you want to get at the globally-installed modules, you can use ls with the global flag:

- -
$ npm ls -g
-/usr/local
-├─┬ A@1.2.3 -> /Users/isaacs/dev-src/js/A
-│ ├── B@1.2.3 -> /Users/isaacs/dev-src/js/B
-│ └─┬ npm@0.3.15 
-│   └── semver@1.0.1 
-├─┬ B@1.2.3 -> /Users/isaacs/dev-src/js/B
-│ └── A@1.2.3 -> /Users/isaacs/dev-src/js/A
-├── glob@2.0.5 
-├─┬ npm@1.0.0 -> /Users/isaacs/dev-src/js/npm
-│ ├── semver@1.0.1 
-│ └─┬ ronn@0.3.5 
-│   └── opts@1.2.1 
-└── supervisor@0.1.2 -> /Users/isaacs/dev-src/js/node-supervisor
-
-$ npm ls -gpl
-/usr/local:::::
-/usr/local/lib/node_modules/A:A@1.2.3::::/Users/isaacs/dev-src/js/A
-/usr/local/lib/node_modules/A/node_modules/npm:npm@0.3.15::::/Users/isaacs/dev-src/js/A/node_modules/npm
-/usr/local/lib/node_modules/A/node_modules/npm/node_modules/semver:semver@1.0.1::::/Users/isaacs/dev-src/js/A/node_modules/npm/node_modules/semver
-/usr/local/lib/node_modules/B:B@1.2.3::::/Users/isaacs/dev-src/js/B
-/usr/local/lib/node_modules/glob:glob@2.0.5::::
-/usr/local/lib/node_modules/npm:npm@1.0.0::::/Users/isaacs/dev-src/js/npm
-/usr/local/lib/node_modules/npm/node_modules/semver:semver@1.0.1::::/Users/isaacs/dev-src/js/npm/node_modules/semver
-/usr/local/lib/node_modules/npm/node_modules/ronn:ronn@0.3.5::::/Users/isaacs/dev-src/js/npm/node_modules/ronn
-/usr/local/lib/node_modules/npm/node_modules/ronn/node_modules/opts:opts@1.2.1::::/Users/isaacs/dev-src/js/npm/node_modules/ronn/node_modules/opts
-/usr/local/lib/node_modules/supervisor:supervisor@0.1.2::::/Users/isaacs/dev-src/js/node-supervisor
-
- -

Those -> flags are indications that the package is link-installed, which will be covered in the next installment.

diff --git a/locale/fa/blog/npm/peer-dependencies.md b/locale/fa/blog/npm/peer-dependencies.md deleted file mode 100644 index 76aa425650b8..000000000000 --- a/locale/fa/blog/npm/peer-dependencies.md +++ /dev/null @@ -1,141 +0,0 @@ ---- -category: npm -title: Peer Dependencies -date: 2013-02-08T00:00:00.000Z -author: Domenic Denicola -slug: peer-dependencies -layout: blog-post.hbs ---- - -Reposted from [Domenic's -blog](http://domenic.me/2013/02/08/peer-dependencies/) with -permission. Thanks! - -npm is awesome as a package manager. In particular, it handles sub-dependencies very well: if my package depends on -`request` version 2 and `some-other-library`, but `some-other-library` depends on `request` version 1, the resulting -dependency graph looks like: - -``` -├── request@2.12.0 -└─┬ some-other-library@1.2.3 - └── request@1.9.9 -``` - -This is, generally, great: now `some-other-library` has its own copy of `request` v1 that it can use, while not -interfering with my package's v2 copy. Everyone's code works! - -## The Problem: Plugins - -There's one use case where this falls down, however: *plugins*. A plugin package is meant to be used with another "host" -package, even though it does not always directly *use* the host package. There are many examples of this pattern in the -Node.js package ecosystem already: - -- Grunt [plugins](http://gruntjs.com/#plugins-all) -- Chai [plugins](http://chaijs.com/plugins) -- LevelUP [plugins](https://github.com/rvagg/node-levelup/wiki/Modules) -- Express [middleware](http://expressjs.com/api.html#middleware) -- Winston [transports](https://github.com/flatiron/winston/blob/master/docs/transports.md) - -Even if you're not familiar with any of those use cases, surely you recall "jQuery plugins" from back when you were a -client-side developer: little `