protocol/SFTP: Manual error emit on write error when autoClose = true. #1091

Open
wants to merge 2 commits into master

Conversation

vorble commented Nov 20, 2021

When using an SFTP writable stream with autoClose = false, errors during the writing process are relayed to the error listeners as they are reported via callback from WriteStream.prototype._write. When autoClose = true, the errors don't reach the listeners because of the call to this.destroy(). This change passes the error to this.destroy() so it is still received by the listener.
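To illustrate the pattern being described, here is a minimal, self-contained sketch using a plain Node.js Writable rather than the actual ssh2 WriteStream; SketchWriteStream and its autoClose handling are only stand-ins that mirror the option's name, not the code in lib/protocol/SFTP.js:

'use strict';

// Sketch only: SketchWriteStream is a stand-in, not the ssh2 implementation.
const { Writable } = require('stream');

class SketchWriteStream extends Writable {
  constructor(options = {}) {
    super();
    this.autoClose = options.autoClose !== false;
  }

  _write(data, encoding, cb) {
    // Pretend the underlying SFTP write reported an error via its callback.
    const er = new Error('simulated write failure');
    if (this.autoClose)
      this.destroy(er); // forward the error instead of calling a bare this.destroy()
    cb(er);
  }
}

const fout = new SketchWriteStream(); // autoClose defaults to true
fout.on('error', (err) => console.log('error listener saw:', err.message));
fout.on('close', () => console.log('fout closed'));
fout.write('data');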

vorble (Author) commented Dec 11, 2021

I've been working on a test case to ensure write stream errors are seen by library users when the stream is created with autoClose set to true (the default), but I haven't been able to replicate the issue that prompted this pull request in that context. The passing test case looks like this: https://gist.github.com/vorble/e3331abd6617fa8c9ae1dd177fbfcffa

Using Node.js v14.15.1 (also v10 and v16) and ssh2 1.5.0, I attempt to transfer a file via .pipe() to a server that can't hold all of it. The server reports SSH-2.0-OpenSSH_7.9p1 Debian-10+deb10u2 and is configured with a very small storage space for the test, mounted at space. Here's the proof of concept (comment out one of the two cases to observe the difference):

'use strict';

const { Client } = require('ssh2');
const fs = require('fs');

if (!process.env.REMOTE_HOST) {
  console.error('Please specify REMOTE_HOST environment variable for the host to SSH to.');
  process.exit(1);
} else if (!process.env.SSH_AUTH_SOCK) {
  console.error('Please start your SSH agent.');
  process.exit(1); // Disable if you don't want to use an agent.
}

// SSH connection settings.
const settings = {
  host: process.env.REMOTE_HOST,
  username: process.env.USER,
  agent: process.env.SSH_AUTH_SOCK, // Change these settings if you don't want to use an agent.
};

// Create this file with: dd if=/dev/zero of=some_zeros bs=1M count=64
// Adjust the parameters so the size is larger than the destination can hold.
const fin = fs.createReadStream('some_zeros');

const conn = new Client();
conn.on('ready', () => {
  console.log('Client :: ready');
  conn.sftp((err, sftp) => {
    if (err)
      throw err;

    // CASE A (choose either case A or B)
    // ----------------------------------
    // This causes an error to show up in the error handler when the transmission
    // fails partway through. The error handler closes the stream with destroy().
    const fout = sftp.createWriteStream('space/myfile', { autoClose: false });

    // CASE B (choose either case A or B)
    // ----------------------------------
    // autoClose is true by default. Even when the transmission fails partway through,
    // the error is not received by the error handler. The stream closes itself
    // and a truncated file is left on the destination.
    //const fout = sftp.createWriteStream('space/myfile');

    // Note: In either case, 'finish' never gets emitted since the transfer never finishes.
    // That could be used as an indicator of an incomplete transfer.

    fout.on('close', () => {
      console.log('fout got event \'close\'');
      conn.end();
    });
    fout.on('error', (error) => {
      // Error handler gets called in case A, but not in case B.
      console.log('fout got event \'error\':', error);
      fout.destroy();
    });
    fin.pipe(fout);
  });
}).connect(settings);

In the case where autoClose is true, I found the lack of an error surprising. I believe the behavior is intrinsic to the way Writable streams are implemented. Here is a demonstration of how Writable streams behave when destroy() and the write-error callback are combined in different orders:

'use strict';

const { Writable } = require('stream');

// Creates a Writable stream with an arbitrary write function and runs a test
// to ensure the write error is detected by the stream's user.
function runTest(writeFunction) {
  let sawError = false;
  const fout = new Writable({
    write(chunk, encoding, callback) {
      console.log('in write()');
      writeFunction.call(this, chunk, encoding, callback);
    }
  });
  fout.on('error', (err) => {
    console.error('fout got error:', err);
    sawError = true;
  });
  fout.on('close', () => {
    console.log('fout got close');
    if (sawError) {
      console.log('SUCCESS: Error was detected.');
    } else {
      console.log('FAIL: Error was not detected.');
    }
  });
  fout.write('data');
}

runTest(function(chunk, encoding, callback) {
  console.log('Testing 1: Destroy, then call callback with error.');
  this.destroy(); // Swallows error.
  callback(new Error('Error'));
});

runTest(function(chunk, encoding, callback) {
  console.log('Testing 2: Call callback with error, then destroy.');
  callback(new Error('Error'));
  this.destroy(); // Swallows error.
});

runTest(function(chunk, encoding, callback) {
  console.log('Testing 3: Call callback with error, then destroy next tick.');
  callback(new Error('Error'));
  process.nextTick(this.destroy.bind(this));
});

runTest(function(chunk, encoding, callback) {
  console.log('Testing 4: Emit error, destroy, then call callback with error.');
  this.emit('error', new Error('Error'));
  this.destroy();
  callback(new Error('Error'));
});

The output is a little messy, but the last two tests succeed in detecting the error. Of the two working approaches, I felt emitting the error was the better one.

@@ -3695,8 +3695,10 @@ WriteStream.prototype._write = function(data, encoding, cb) {
     this.pos,
     (er, bytes) => {
       if (er) {
-        if (this.autoClose)
+        if (this.autoClose) {
+          this.emit('error', er); // er is swallowed by destroy(), emit manually.
           this.destroy();
Owner commented:

Can't we just do something like this instead:

Suggested change:
-          this.destroy();
+          this.destroy(er);

That should have the same effect.
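As a quick standalone check (a plain Node.js Writable, not the ssh2 stream), destroy(err) does deliver the error to 'error' listeners, which is why it should be equivalent to the manual emit:

'use strict';

const { Writable } = require('stream');

// Stand-in stream; the write implementation itself never fails here.
const out = new Writable({
  write(chunk, encoding, callback) { callback(); }
});

out.on('error', (err) => console.log('error listener saw:', err.message));
out.on('close', () => console.log('out closed'));

// Unlike a bare destroy(), destroy(err) forwards err to the 'error' listeners.
out.destroy(new Error('simulated write failure'));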

vorble (Author) replied:

Yes, thank you. I've updated the branch and tested.
