PowerShell's pipe adds linefeed

#1
I'm trying to pipe a string into a program's STDIN *without* any trailing linefeeds (unless that string itself actually ends in a linefeed). I tried googling around, but I only found people trying to print to the *console* without a trailing linefeed, in which case `Write-Host` takes a parameter `-NoNewLine`. However, to pipe it on to another program, I need `Write-Output` or similar which doesn't have such a parameter. Now it seems like `Write-Output` isn't even the problem:

Z:\> (Write-Output "abc").Length
3

But as soon as I pipe it to another program and read the string there, I get an additional linefeed. For instance, I tried this Ruby snippet:

Z:\> Write-Output "abc" | ruby -e "p ARGF.read"
"abc\n"

I checked that the actual string received is `abc\n`. The same happens in several other languages (at least C#, Java and Python), so I believe it's an issue with PowerShell, not the language doing the reading.

As a further test, I replaced `Write-Output` itself with another Ruby script:

Z:\> ruby -e "$> << 'abc'"
abcZ:\>

(That is, there is definitely no `\n` on the script's STDOUT.)

But again, when I pipe it into another script:

Z:\> ruby -e "$> << 'abc'" | ruby -e "p ARGF.read"
"abc\n"

I'm fairly convinced that it's the pipe which adds the linefeed. How do I avoid that? I actually want to be able to control whether the input ends in a linefeed or not (by including it in the input or omitting it).

(For reference, I also tested strings which already contain a trailing linefeed, and in that case the pipe *doesn't* add another one, so I guess it just ensures a trailing linefeed.)

I originally encountered this in PowerShell v3, but I'm now using v5 and still have the same issue.

#2
Brute-force approach: feed binary data to the process's stdin. I've tested this code on `cat.exe` from UnixUtils and it seems to do what you want:


# Text to send
$InputVar = "No Newline, No NewLine,`nNewLine, No NewLine,`nNewLine, No NewLine"

# Buffer & initial size of MemoryStream
$BufferSize = 4096

# Convert text to bytes and write to MemoryStream
[byte[]]$InputBytes = [Text.Encoding]::UTF8.GetBytes($InputVar)
$MemStream = New-Object -TypeName System.IO.MemoryStream -ArgumentList $BufferSize
$MemStream.Write($InputBytes, 0, $InputBytes.Length)
[Void]$MemStream.Seek(0, 'Begin')

# Setup stdin\stdout redirection for our process
$StartInfo = New-Object -TypeName System.Diagnostics.ProcessStartInfo -Property @{
    FileName = 'MyLittle.exe'
    UseShellExecute = $false
    RedirectStandardInput = $true
}

# Create new process
$Process = New-Object -TypeName System.Diagnostics.Process

# Assign previously created StartInfo properties
$Process.StartInfo = $StartInfo

# Start process
[void]$Process.Start()

# Pipe data
$Buffer = New-Object -TypeName byte[] -ArgumentList $BufferSize
$StdinStream = $Process.StandardInput.BaseStream

try
{
    do
    {
        $ReadCount = $MemStream.Read($Buffer, 0, $Buffer.Length)
        $StdinStream.Write($Buffer, 0, $ReadCount)
        $StdinStream.Flush()
    }
    while($ReadCount -gt 0)
}
catch
{
    throw 'Houston, we have a problem!'
}
finally
{
    # Close streams
    $StdinStream.Close()
    $MemStream.Close()
}

# Cleanup
'Process', 'StdinStream', 'MemStream' |
    ForEach-Object {
        (Get-Variable $_ -ValueOnly).Dispose()
        Remove-Variable $_ -Force
    }
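
For the question's concrete test, the same approach can be trimmed down to something like this (a sketch; it assumes ruby.exe is on PATH, as in the question):

# Send exactly "abc" - no trailing newline - to the question's Ruby reader.
$Bytes = [Text.Encoding]::UTF8.GetBytes('abc')

$StartInfo = New-Object -TypeName System.Diagnostics.ProcessStartInfo -Property @{
    FileName              = 'ruby'
    Arguments             = '-e "p ARGF.read"'
    UseShellExecute       = $false
    RedirectStandardInput = $true
}

$Process = [System.Diagnostics.Process]::Start($StartInfo)
$Stdin = $Process.StandardInput.BaseStream
$Stdin.Write($Bytes, 0, $Bytes.Length)
$Stdin.Flush()
$Stdin.Close()           # closing stdin is what signals EOF to the child
$Process.WaitForExit()   # ruby prints "abc" - ARGF.read saw no appended newline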



#3
## Introduction

Here is my `Invoke-RawPipeline` function (the latest version is maintained as a Gist).

Use it to pipe binary data between processes' Standard Output and Standard Input streams. It can read the input stream from a file or the pipeline and save the resulting output stream to a file.

It requires the PsAsync module to be able to launch processes and pipe data between them.

In case of issues, use the `-Verbose` switch to see debug output.

## Examples

### Redirecting to file

* Batch:
`findstr.exe /C:"Warning" /I C:\Windows\WindowsUpdate.log > C:\WU_Warnings.txt`
* PowerShell:
`Invoke-RawPipeline -Command @{Path = 'findstr.exe' ; Arguments = '/C:"Warning" /I C:\Windows\WindowsUpdate.log'} -OutFile 'C:\WU_Warnings.txt'`

### Redirecting from file

* Batch:
`svnadmin load < C:\RepoDumps\MyRepo.dump`
* PowerShell:
`Invoke-RawPipeline -InFile 'C:\RepoDumps\MyRepo.dump' -Command @{Path = 'svnadmin.exe' ; Arguments = 'load'}`

### Piping strings

* Batch:
`echo TestString | find /I "test" > C:\SearchResult.log`
* PowerShell:
`'TestString' | Invoke-RawPipeline -Command @{Path = 'find.exe' ; Arguments = '/I "test"'} -OutFile 'C:\SearchResult.log'`

### Piping between multiple processes

* Batch:
`ipconfig | findstr /C:"IPv4 Address" /I`
* PowerShell:
`Invoke-RawPipeline -Command @{Path = 'ipconfig'}, @{Path = 'findstr' ; Arguments = '/C:"IPv4 Address" /I'} -RawData`
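
### Piping the question's own test string

A sketch, assuming ruby.exe is on PATH and the function below has been dot-sourced:

* PowerShell:
`'abc' | Invoke-RawPipeline -Command @{Path = 'ruby.exe' ; Arguments = '-e "p ARGF.read"'} -RawData`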


## Code:


<#
.Synopsis
Pipe binary data between processes' Standard Output and Standard Input streams.
Can read input stream from file and save resulting output stream to file.

.Description
Pipe binary data between processes' Standard Output and Standard Input streams.
Can read input stream from file/pipeline and save resulting output stream to file.
Requires PsAsync module.

.Notes
Author: beatcracker
License: Microsoft Public License

.Component
Requires PsAsync module.

.Parameter Command
An array of hashtables, each containing Command Name, Working Directory and Arguments

.Parameter InFile
This parameter is optional.

A string representing the path to a file to read the input stream from.

.Parameter OutFile
This parameter is optional.

A string representing the path to a file to save the resulting output stream to.

.Parameter Append
This parameter is optional. Default is false.

A switch controlling whether to overwrite or append to the output file if it already exists. Default is to overwrite.

.Parameter IoTimeout
This parameter is optional. Default is 0.

A number of seconds to wait if Input/Output streams are blocked. Default is to wait indefinitely.

.Parameter ProcessTimeout
This parameter is optional. Default is 0.

A number of seconds to wait for the process to exit after finishing all pipeline operations. Default is to wait indefinitely.

.Parameter BufferSize
This parameter is optional. Default is 4096.

Size of buffer in bytes for read\write operations. Supports standard Powershell multipliers: KB, MB, GB, TB, and PB.
Total number of buffers is: Command.Count * 2 + InFile + OutFile.

.Parameter ForceGC
This parameter is optional.

A switch that, if specified, will force .NET garbage collection.
Use it to immediately release memory on function exit if a large buffer size was used.

.Parameter RawData
This parameter is optional.

By default, the function returns an object with StdOut/StdErr streams and the processes' exit codes.
If this switch is specified, the function will return the raw Standard Output stream.

.Example
Invoke-RawPipeline -Command @{Path = 'findstr.exe' ; Arguments = '/C:"Warning" /I C:\Windows\WindowsUpdate.log'} -OutFile 'C:\WU_Warnings.txt'

Batch analog: findstr.exe /C:"Warning" /I C:\Windows\WindowsUpdate.log > C:\WU_Warnings.txt

.Example
Invoke-RawPipeline -Command @{Path = 'findstr.exe' ; WorkingDirectory = 'C:\Windows' ; Arguments = '/C:"Warning" /I .\WindowsUpdate.log'} -RawData

Batch analog: cd /D C:\Windows && findstr.exe /C:"Warning" /I .\WindowsUpdate.log

.Example
'TestString' | Invoke-RawPipeline -Command @{Path = 'find.exe' ; Arguments = '/I "test"'} -OutFile 'C:\SearchResult.log'

Batch analog: echo TestString | find /I "test" > C:\SearchResult.log

.Example
Invoke-RawPipeline -Command @{Path = 'ipconfig'}, @{Path = 'findstr' ; Arguments = '/C:"IPv4 Address" /I'} -RawData

Batch analog: ipconfig | findstr /C:"IPv4 Address" /I

.Example
Invoke-RawPipeline -InFile 'C:\RepoDumps\Repo.svn' -Command @{Path = 'svnadmin.exe' ; Arguments = 'load'}

Batch analog: svnadmin load < C:\RepoDumps\MyRepo.dump
#>

function Invoke-RawPipeline
{
[CmdletBinding()]
Param
(
[Parameter(ValueFromPipeline = $true)]
[ValidateNotNullOrEmpty()]
[ValidateScript({
if($_.psobject.Methods.Match('ToString'))
{
$true
}
else
{
throw 'Can''t convert pipeline object to string!'
}
})]
$InVariable,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[ValidateScript({
$_ | ForEach-Object {
$Path = $_.Path
$WorkingDirectory = $_.WorkingDirectory

if(!(Get-Command -Name $Path -CommandType Application -ErrorAction SilentlyContinue))
{
throw "Command not found: $Path"
}

if($WorkingDirectory)
{
if(!(Test-Path -LiteralPath $WorkingDirectory -PathType Container -ErrorAction SilentlyContinue))
{
throw "Working directory not found: $WorkingDirectory"
}
}
}
$true
})]
[ValidateNotNullOrEmpty()]
[array]$Command,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[ValidateScript({
if(!(Test-Path -LiteralPath $_))
{
throw "File not found: $_"
}
$true
})]
[ValidateNotNullOrEmpty()]
[string]$InFile,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[ValidateScript({
if(!(Test-Path -LiteralPath (Split-Path $_)))
{
throw "Folder not found: $_"
}
$true
})]
[ValidateNotNullOrEmpty()]
[string]$OutFile,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[switch]$Append,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[ValidateRange(0, 2147483)]
[int]$IoTimeout = 0,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[ValidateRange(0, 2147483)]
[int]$ProcessTimeout = 0,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[long]$BufferSize = 4096,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[switch]$RawData,

[Parameter(ValueFromPipelineByPropertyName = $true)]
[switch]$ForceGC
)

Begin
{

$Modules = @{PsAsync = 'http://psasync.codeplex.com'}

'Loading modules:', ($Modules | Format-Table -HideTableHeaders -AutoSize | Out-String) | Write-Verbose

foreach($module in $Modules.GetEnumerator())
{
if(!(Get-Module -Name $module.Key))
{
Try
{
Import-Module -Name $module.Key -ErrorAction Stop
}
Catch
{
throw "$($module.Key) module not available. Get it here: $($module.Value)"
}
}
}

function New-ConsoleProcess
{
Param
(
[string]$Path,
[string]$Arguments,
[string]$WorkingDirectory,
[switch]$CreateNoWindow = $true,
[switch]$RedirectStdIn = $true,
[switch]$RedirectStdOut = $true,
[switch]$RedirectStdErr = $true
)

if(!$WorkingDirectory)
{
if(!$script:MyInvocation.MyCommand.Path)
{
$WorkingDirectory = [System.AppDomain]::CurrentDomain.BaseDirectory
}
else
{
$WorkingDirectory = Split-Path $script:MyInvocation.MyCommand.Path
}
}

Try
{
$ps = New-Object -TypeName System.Diagnostics.Process -ErrorAction Stop
$ps.StartInfo.Filename = $Path
$ps.StartInfo.Arguments = $Arguments
$ps.StartInfo.UseShellExecute = $false
$ps.StartInfo.RedirectStandardInput = $RedirectStdIn
$ps.StartInfo.RedirectStandardOutput = $RedirectStdOut
$ps.StartInfo.RedirectStandardError = $RedirectStdErr
$ps.StartInfo.CreateNoWindow = $CreateNoWindow
$ps.StartInfo.WorkingDirectory = $WorkingDirectory
}
Catch
{
throw $_
}

return $ps
}

function Invoke-GarbageCollection
{
[gc]::Collect()
[gc]::WaitForPendingFinalizers()
}

$CleanUp = {
$IoWorkers + $StdErrWorkers |
ForEach-Object {
$_.Src, $_.Dst |
ForEach-Object {
if(!($_ -is [System.Diagnostics.Process]))
{
Try
{
$_.Close()
}
Catch
{
Write-Error "Failed to close $_"
}
$_.Dispose()
}

}
}
}

$PumpData = {
Param
(
[hashtable]$Cfg
)
# Fail hard, we don't want stuck threads
$Private:ErrorActionPreference = 'Stop'

$Src = $Cfg.Src
$SrcEndpoint = $Cfg.SrcEndpoint
$Dst = $Cfg.Dst
$DstEndpoint = $Cfg.DstEndpoint
$BufferSize = $Cfg.BufferSize
$SyncHash = $Cfg.SyncHash
$RunspaceId = $Cfg.Id

# Setup Input and Output streams
if($Src -is [System.Diagnostics.Process])
{
switch ($SrcEndpoint)
{
'StdOut' {$InStream = $Src.StandardOutput.BaseStream}
'StdIn' {$InStream = $Src.StandardInput.BaseStream}
'StdErr' {$InStream = $Src.StandardError.BaseStream}
default {throw "Not valid source endpoint: $_"}
}
}
else
{
$InStream = $Src
}

if($Dst -is [System.Diagnostics.Process])
{
switch ($DstEndpoint)
{
'StdOut' {$OutStream = $Dst.StandardOutput.BaseStream}
'StdIn' {$OutStream = $Dst.StandardInput.BaseStream}
'StdErr' {$OutStream = $Dst.StandardError.BaseStream}
default {throw "Not valid destination endpoint: $_"}
}
}
else
{
$OutStream = $Dst
}

$InStream | Out-String | ForEach-Object {$SyncHash.$RunspaceId.Status += "InStream: $_"}
$OutStream | Out-String | ForEach-Object {$SyncHash.$RunspaceId.Status += "OutStream: $_"}

# Main data copy loop
$Buffer = New-Object -TypeName byte[] $BufferSize
$BytesThru = 0

Try
{
Do
{
$SyncHash.$RunspaceId.IoStartTime = [DateTime]::UtcNow.Ticks
$ReadCount = $InStream.Read($Buffer, 0, $Buffer.Length)
$OutStream.Write($Buffer, 0, $ReadCount)
$OutStream.Flush()
$BytesThru += $ReadCount
}
While($readCount -gt 0)
}
Catch
{
$SyncHash.$RunspaceId.Status += $_

}
Finally
{
$OutStream.Close()
$InStream.Close()
}
}
}

Process
{
$PsCommand = @()
if($Command.Length)
{
Write-Verbose 'Creating new process objects'
$i = 0
foreach($cmd in $Command.GetEnumerator())
{
$PsCommand += New-ConsoleProcess @cmd
$i++
}
}

Write-Verbose 'Building I\O pipeline'
$PipeLine = @()
if($InVariable)
{
[Byte[]]$InVarBytes = [Text.Encoding]::UTF8.GetBytes($InVariable.ToString())
$PipeLine += New-Object -TypeName System.IO.MemoryStream -ArgumentList $BufferSize -ErrorAction Stop
$PipeLine[-1].Write($InVarBytes, 0, $InVarBytes.Length)
[Void]$PipeLine[-1].Seek(0, 'Begin')
}
elseif($InFile)
{
$PipeLine += New-Object -TypeName System.IO.FileStream -ArgumentList ($InFile, [IO.FileMode]::Open) -ErrorAction Stop
if($PsCommand.Length)
{
$PsCommand[0].StartInfo.RedirectStandardInput = $true
}
}
else
{
if($PsCommand.Length)
{
$PsCommand[0].StartInfo.RedirectStandardInput = $false
}
}

$PipeLine += $PsCommand

if($OutFile)
{
if($PsCommand.Length)
{
$PsCommand[-1].StartInfo.RedirectStandardOutput = $true
}

if($Append)
{
$FileMode = [System.IO.FileMode]::Append
}
else
{
$FileMode = [System.IO.FileMode]::Create
}

$PipeLine += New-Object -TypeName System.IO.FileStream -ArgumentList ($OutFile, $FileMode, [System.IO.FileAccess]::Write) -ErrorAction Stop
}
else
{
if($PsCommand.Length)
{
$PipeLine += New-Object -TypeName System.IO.MemoryStream -ArgumentList $BufferSize -ErrorAction Stop
}
}

Write-Verbose 'Creating I\O threads'
$IoWorkers = @()
for($i=0 ; $i -lt ($PipeLine.Length-1) ; $i++)
{
$SrcEndpoint = $DstEndpoint = $null
if($PipeLine[$i] -is [System.Diagnostics.Process])
{
$SrcEndpoint = 'StdOut'
}
if($PipeLine[$i+1] -is [System.Diagnostics.Process])
{
$DstEndpoint = 'StdIn'
}

$IoWorkers += @{
Src = $PipeLine[$i]
SrcEndpoint = $SrcEndpoint
Dst = $PipeLine[$i+1]
DstEndpoint = $DstEndpoint
}
}
Write-Verbose "Created $($IoWorkers.Length) I\O worker objects"

Write-Verbose 'Creating StdErr readers'
$StdErrWorkers = @()
for($i=0 ; $i -lt $PsCommand.Length ; $i++)
{
$StdErrWorkers += @{
Src = $PsCommand[$i]
SrcEndpoint = 'StdErr'
Dst = New-Object -TypeName System.IO.MemoryStream -ArgumentList $BufferSize -ErrorAction Stop
}
}
Write-Verbose "Created $($StdErrWorkers.Length) StdErr reader objects"

Write-Verbose 'Starting processes'
$PsCommand |
ForEach-Object {
$ps = $_
Try
{
[void]$ps.Start()
}
Catch
{
Write-Error "Failed to start process: $($ps.StartInfo.FileName)"
Write-Verbose "Can't launch process, killing and disposing all"

if($PsCommand)
{
$PsCommand |
ForEach-Object {
Try{$_.Kill()}Catch{} # Can't do much if kill fails...
$_.Dispose()
}
}

Write-Verbose 'Closing and disposing I\O streams'
. $CleanUp
}
Write-Verbose "Started new process: Name=$($ps.Name), Id=$($ps.Id)"
}

$WorkersCount = $IoWorkers.Length + $StdErrWorkers.Length
Write-Verbose 'Creating sync hashtable'
$sync = @{}
for($i=0 ; $i -lt $WorkersCount ; $i++)
{
$sync += @{$i = @{IoStartTime = $null ; Status = $null}}
}
$SyncHash = [hashtable]::Synchronized($sync)

Write-Verbose 'Creating runspace pool'
$RunspacePool = Get-RunspacePool $WorkersCount

Write-Verbose 'Loading workers on the runspace pool'
$AsyncPipelines = @()
$i = 0
$IoWorkers + $StdErrWorkers |
ForEach-Object {
$Param = @{
BufferSize = $BufferSize
Id = $i
SyncHash = $SyncHash
} + $_

$AsyncPipelines += Invoke-Async -RunspacePool $RunspacePool -ScriptBlock $PumpData -Parameters $Param
$i++

Write-Verbose 'Started working thread'
$Param | Format-Table -HideTableHeaders -AutoSize | Out-String | Write-Debug
}

Write-Verbose 'Waiting for I\O to complete...'
if($IoTimeout){Write-Verbose "Timeout is $IoTimeout seconds"}

Do
{
# Check for pipelines with errors
[array]$FailedPipelines = Receive-AsyncStatus -Pipelines $AsyncPipelines | Where-Object {$_.Completed -and $_.Error}
if($FailedPipelines)
{
"$($FailedPipelines.Length) pipeline(s) failed!",
($FailedPipelines | Select-Object -ExpandProperty Error | Format-Table -AutoSize | Out-String) | Write-Debug
}

if($IoTimeout)
{
# Compare I\O start time of thread with current time
[array]$LockedReaders = $SyncHash.Keys | Where-Object {[TimeSpan]::FromTicks([DateTime]::UtcNow.Ticks - $SyncHash.$_.IoStartTime).TotalSeconds -gt $IoTimeout}
if($LockedReaders)
{
# Yikes, someone is stuck
"$($LockedReaders.Length) I\O operations reached timeout!" | Write-Verbose
$SyncHash.GetEnumerator() | ForEach-Object {"$($_.Key) = $($_.Value.Status)"} | Sort-Object | Out-String | Write-Debug
$PsCommand | ForEach-Object {
Write-Verbose "Killing process: Name=$($_.Name), Id=$($_.Id)"
Try
{
$_.Kill()
}
Catch
{
Write-Error 'Failed to kill process!'
}
}
break
}
}
Start-Sleep 1
}
While(Receive-AsyncStatus -Pipelines $AsyncPipelines | Where-Object {!$_.Completed}) # Loop until all pipelines are finished

Write-Verbose 'Waiting for all pipelines to finish...'
$IoStats = Receive-AsyncResults -Pipelines $AsyncPipelines
Write-Verbose 'All pipelines are finished'

Write-Verbose 'Collecting StdErr for all processes'
$PipeStdErr = $StdErrWorkers |
ForEach-Object {
$Encoding = $_.Src.StartInfo.StandardOutputEncoding
if(!$Encoding)
{
$Encoding = [System.Text.Encoding]::Default
}
@{
FileName = $_.Src.StartInfo.FileName
StdErr = $Encoding.GetString($_.Dst.ToArray())
ExitCode = $_.Src.ExitCode
}
} |
Select-Object @{Name = 'FileName' ; Expression = {$_.FileName}},
@{Name = 'StdErr' ; Expression = {$_.StdErr}},
@{Name = 'ExitCode' ; Expression = {$_.ExitCode}}

if($IoWorkers[-1].Dst -is [System.IO.MemoryStream])
{
Write-Verbose 'Collecting final pipeline output'
if($IoWorkers[-1].Src -is [System.Diagnostics.Process])
{
$Encoding = $IoWorkers[-1].Src.StartInfo.StandardOutputEncoding
}
if(!$Encoding)
{
$Encoding = [System.Text.Encoding]::Default
}
$PipeResult = $Encoding.GetString($IoWorkers[-1].Dst.ToArray())
}


Write-Verbose 'Closing and disposing I\O streams'
. $CleanUp

$PsCommand |
ForEach-Object {
$_.Refresh()
if(!$_.HasExited)
{
Write-Verbose "Process is still active: Name=$($_.Name), Id=$($_.Id)"
if(!$ProcessTimeout)
{
$ProcessTimeout = -1
}
else
{
$WaitForExitProcessTimeout = $ProcessTimeout * 1000
}
Write-Verbose "Waiting for process to exit (Process Timeout = $ProcessTimeout)"
if(!$_.WaitForExit($WaitForExitProcessTimeout))
{
Try
{
Write-Verbose 'Trying to kill it'
$_.Kill()
}
Catch
{
Write-Error "Failed to kill process $_"
}
}
}
Write-Verbose "Disposing process object: Name=$($_.StartInfo.FileName)"
$_.Dispose()
}

Write-Verbose 'Disposing runspace pool'
$RunspacePool.Dispose()

if($ForceGC)
{
Write-Verbose 'Forcing garbage collection'
Invoke-GarbageCollection
}

if(!$RawData)
{
New-Object -TypeName psobject -Property @{Result = $PipeResult ; Status = $PipeStdErr}
}
else
{
$PipeResult
}
}
}

#4
I will admit to having zero experience with the `ruby -e "p ARGF.read"` command you are using after the pipe, but I think I can show that the pipe doesn't add a newline.

# check length of string without newline after pipe
Write-Output "abc" | %{Write-Host "$_ has a length of: $($_.Length)" }

# check length of string with newline after pipe
Write-Output "def`n" | %{Write-Host "$($_.Length) is the length of $_" -NoNewline }

#write a string without newline (suppressing newline on Write-Host)
Write-Output 'abc' | %{ Write-Host $_ -NoNewline; }

#write a string with newline (suppressing newline on Write-Host)
Write-Output "def`n" | %{ Write-Host $_ -NoNewline; }

#write a final string without newline (suppressing newline on Write-Host)
Write-Output 'ghi' | %{ Write-Host $_ -NoNewline; }

This gives me an output of:

abc has a length of: 3
4 is the length of def
abcdef
ghi

I think you might want to start looking at the `ruby -e "p ARGF.read"` command and see if it is adding a newline after each read.
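
One way to settle where the newline comes from is to dump the raw bytes the external program actually receives on stdin (a sketch; assumes ruby.exe is on PATH, as in the question):

# If a trailing 10 (or 13, 10) shows up that 'abc' itself does not contain,
# it was added on the way across the native-command boundary, not by the reader.
'abc' | ruby -e "STDIN.binmode; p STDIN.read.bytes"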


#5
Do it the simple way: create a cmd process and execute it:

$cmdArgs = @('/c','something.exe','arg1', .. , 'arg2' , $anotherArg , '<', '$somefile.txt' )
&'cmd.exe' $cmdArgs

Worked perfectly for piping the information I wanted into stdin.
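
For the question's scenario, a concrete version might look like this (a sketch; input.txt is a hypothetical file and ruby.exe is assumed to be on PATH):

# Write the payload without any trailing newline, then let cmd.exe handle the '<'
# redirection so PowerShell never re-encodes the stream on its way to the child.
[IO.File]::WriteAllText("$PWD\input.txt", 'abc')
$cmdArgs = @('/c', 'ruby', '-e', 'p ARGF.read', '<', 'input.txt')
&'cmd.exe' $cmdArgs    # ruby reports "abc" - nothing was appended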

#6
To clear up a fundamental misconception in some of the comments: the "PowerShell commands" in a pipeline are cmdlets, and each one runs within the process space of the *single* PowerShell process. Thus objects are passed as-is within the same process (on multiple threads) **UNLESS** you invoke an external command. Then the passed objects are converted to strings by the appropriate formatting cmdlet (if not already string objects). These strings are then converted to a stream of characters, with each string having a \n appended. So it is not "the pipeline" adding the \n but the implicit conversion to text for input to the "legacy" command.
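
A two-line illustration of that boundary, reusing the question's reader (assumes ruby.exe is on PATH):

# Staying inside the PowerShell process: the [string] object is handed over unchanged.
('abc' | ForEach-Object { $_.Length })    # 3

# Crossing to a native program: each object is rendered as one line of text,
# so the reader sees the appended newline reported in the question.
'abc' | ruby -e "p ARGF.read"             # "abc\n"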

The basic problem in the question is that the asker is trying to get object-like behaviour (e.g. a string with no trailing \n) on a character (byte) stream input. The standard input stream of a (console) process supplies characters (bytes) one at a time. The input *routines* collect these individual characters into a single string (typically) terminated when a \n is received. Whether the \n is returned as part of the string is up to the input routine. When the standard input stream is redirected to a file or pipe, the input routines mostly have no knowledge of this. So there is no way to determine the difference between a complete string with no \n and an incomplete string with more characters and the \n still to come.

Possible solutions (to the string-delimiting problem, not the PowerShell-added \n problem) would be to have some sort of time-out on the standard input reads. The end of a string could be signaled by no received characters for a certain time. Alternatively, if you had low-enough-level access to the pipe, you could try to have atomic reads and writes; in this way a blocked read would return exactly what was written. Unfortunately, both of these methods have timing problems when running within a multitasking environment. If the delay is long then efficiency falls, but if it is too short then it can be fooled by delays caused by process priority scheduling. Scheduling priorities can also interfere with atomic reads and writes if the writing process writes another line before the reading process has read the current one. It would need some sort of synchronizing system.

The only other way to signal that there are no more characters coming on the current line would be to close the pipe (EOF), but this is a once-only method, so you can only send one string (trailing \n or not). (This is how Ruby knows when the input is finished both in the initial example and in the `Invoke-RawPipeline` example.) It is possible that this is actually your intention (send only one string with or without a trailing \n), in which case you could simply concatenate all the input (retaining or re-inserting any embedded \n) and throw away the last \n.
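
If sending a single string is indeed the intention, the receiving side only needs to trim one terminator after reading to EOF. A minimal sketch of such a reader written in PowerShell (any language with raw stdin access would do; Reader.ps1 is a hypothetical file name):

# Read everything from stdin, then drop exactly one trailing newline if present.
# Usage (hypothetical): 'abc' | powershell -NoProfile -File .\Reader.ps1
$raw = [Console]::In.ReadToEnd()
if ($raw.EndsWith("`r`n")) { $raw = $raw.Substring(0, $raw.Length - 2) }
elseif ($raw.EndsWith("`n")) { $raw = $raw.Substring(0, $raw.Length - 1) }
"Received $($raw.Length) characters"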

A possible solution to the PowerShell-added \n problem for multiple strings would be to redefine your encoding of "a string" by terminating each string object with an otherwise invalid character sequence. \0 could be used if you have per-character input (no good for C-like line input), otherwise maybe \377 (0xff). This would allow the input routines of your legacy command to "know" when a string ended. The sequence \0\n (or \377\n) would be the "end" of the string, and everything before that (including a trailing \n or not, possibly using multiple reads) would be the string. I am assuming that you have some control over the input routines (e.g. you wrote the program), since any off-the-shelf program reading from standard input would typically be expecting a \n (or EOF) to delimit its input.
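
A sketch of that sentinel idea with NUL as the terminator, sender in PowerShell and the reader reusing the question's Ruby one-liner (both the framing and the reader are illustrative, not an off-the-shelf protocol):

# Terminate every logical string with "`0"; the reader splits on NUL and discards
# whatever follows the final record (i.e. the newline PowerShell appends).
$records = "with trailing newline`n", 'without trailing newline'
($records | ForEach-Object { $_ + "`0" }) -join '' |
    ruby -e "p ARGF.read.split(0.chr, -1)[0..-2]"
# => ["with trailing newline\n", "without trailing newline"]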


