Using experimental images of particles for generation of training data #181
Hello! Here is an example of the code:

```python
import numpy as np
import matplotlib.pyplot as plt
import deeptrack as dt
from deeptrack import Feature, Image

IMAGE_SIZE = 256
pattern = np.load(file_name + '.npy')  # file_name is defined elsewhere

class NanoParticle(Feature):
    def get(self, image, position, intensity, **kwargs):
        # Stamp the normalized, intensity-scaled pattern centered at `position`
        x, y = pattern.shape
        image[position[0] - x // 2 : position[0] + x - x // 2,
              position[1] - y // 2 : position[1] + y - y // 2] = (
            pattern / np.max(pattern) * intensity
        )
        return image

nanopar = NanoParticle(
    position=lambda: np.random.randint(IMAGE_SIZE - 20, size=2) + 10,  # >= 10 px from the border
    intensity=lambda: np.random.rand() * 0.8 + 0.2,  # in [0.2, 1.0)
)

nanopars = nanopar ^ (lambda: np.random.randint(100))  # 0-99 particles

nanopars_noise = (
    nanopars
    >> dt.NormalizeMinMax()
    >> dt.Poisson(snr=lambda: np.random.rand() * 20 + 5, background=0.01)
)

input_image = Image(np.zeros((IMAGE_SIZE, IMAGE_SIZE)))
output_image = nanopars_noise.resolve(input_image)
plt.imshow(output_image, cmap='gray')
```
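For anyone adapting this: the slicing in `get` centers the pattern on `position` for both even and odd pattern sizes, but note that if `position` ever falls closer to the border than half the pattern size, the start index goes negative and NumPy wraps it around to the opposite edge. A minimal pure-NumPy sketch of the same stamping arithmetic (the 3x3 pattern here is a hypothetical stand-in for the experimental one):

```python
import numpy as np

def stamp(image, pattern, position, intensity):
    # Same centering arithmetic as NanoParticle.get above
    x, y = pattern.shape
    image[position[0] - x // 2 : position[0] + x - x // 2,
          position[1] - y // 2 : position[1] + y - y // 2] = (
        pattern / np.max(pattern) * intensity
    )
    return image

img = np.zeros((8, 8))
pat = np.ones((3, 3))  # hypothetical stand-in for the experimental pattern
stamp(img, pat, position=(4, 4), intensity=0.5)
# the 3x3 block spanning rows/cols 3..5 is now 0.5
```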
Sure! What do you want the network to do? Is it detection?
Ok! First, I'd change the pipeline slightly, to:
which allows you to resolve the pipeline without giving it an argument.
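The general idea behind resolving without an argument is to make the pipeline's first stage create the blank input image itself, rather than constructing an `Image` by hand every time. A library-independent sketch of that pattern, with plain Python functions standing in for deeptrack features (an assumption on my part, not the deeptrack API):

```python
import numpy as np

IMAGE_SIZE = 256

def blank_image():
    # The first stage produces the input, so callers pass nothing
    return np.zeros((IMAGE_SIZE, IMAGE_SIZE))

def add_particle(image):
    # Stand-in for the NanoParticle feature
    image[100:105, 100:105] = 1.0
    return image

def resolve():
    # The "pipeline": each stage feeds the next, starting from blank_image()
    return add_particle(blank_image())

output_image = resolve()  # no input image needed
```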
Second, to get the mask, you can do:
`image_and_mask` can be directly fed to a deeptrack unet, or resolved many ti…
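For readers following along: a common way to get a ground-truth mask is to stamp the same sampled positions into a second, binary channel and stack it with the image. A minimal NumPy sketch of that idea (not the deeptrack API; the `threshold` cutoff and the hard-coded positions are hypothetical):

```python
import numpy as np

IMAGE_SIZE = 64
pattern = np.ones((5, 5))   # hypothetical stand-in pattern
threshold = 0.5             # assumed cutoff for "particle" pixels

image = np.zeros((IMAGE_SIZE, IMAGE_SIZE))
mask = np.zeros((IMAGE_SIZE, IMAGE_SIZE))

positions = [(10, 12), (30, 40)]  # would be sampled in the real pipeline
for px, py in positions:
    x, y = pattern.shape
    rows = slice(px - x // 2, px + x - x // 2)
    cols = slice(py - y // 2, py + y - y // 2)
    image[rows, cols] = pattern / pattern.max()
    mask[rows, cols] = (pattern > threshold)  # binary ground truth

# Stack image and mask as channels, e.g. as input/target for a U-Net
image_and_mask = np.stack([image, mask], axis=-1)
```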